VCE Report for Teachers

Information Processing and Management 3: Written examination

GENERAL COMMENTS

The Information Processing and Management examination for 2000 marked the introduction of a new examination format. The paper consisted of fourteen separate questions. The advantage of this approach is that students who did not understand the details of a particular scenario within a question were still able to score well on the rest of the paper.

The spread of students’ scores was wide, with the lower scores pertaining to students who either did not address a question or failed to relate their answer to the stimulus material. The greatest spread of scores occurred on Questions 10–14, where students were required to explain or justify their answers. Many students provided very short answers and failed to acknowledge the circumstances of the scenario in the question.

Teachers had obviously prepared students well in relation to the Internet and web publishing, as Question 4 was extremely well done. Very few students obtained fewer than 3 marks out of 4 for their answers. On the other hand, many students incorrectly assumed that if a software question was asked, such as Question 3, then the two software tools they had studied had to be used in the answer. Students at this level are expected to have a general understanding of the purposes of the more common software tools.

Generally, students should be urged to write longer responses when asked in a question to ‘explain’ or ‘justify’. Many students did not provide enough detail to obtain full marks. For example, students would be expected to provide more than a one-sentence response to a question worth 4 marks that warranted an explanation or justification.

Most students coped well with the choices offered in the paper and selected the appropriate number of parts to answer. However, a small number of students answered all parts of optional questions.

SPECIFIC INFORMATION

Responses to all parts of a question are provided, even where students were only required to answer a selection of parts. The answers provided are not exhaustive; they reflect the expected responses or the more common responses provided by students.

Question 1

From the provided list of three roles, the expected responses were:

Position               Role
operational managers   implementation plans
senior management      long-term planning

Many students only scored 1 mark as they did not pick up that a manager’s role is planning, and incorrectly selected ‘daily tasks’ for the operational manager’s role.

Question 2

a.

Students found this question challenging and frequently attempted to draw an organisation chart. This type of response was not awarded any marks. An acceptable response was:

b.

Naming convention

Any convention that suitably differentiated the letters was accepted, such as recipient name and date. Some students confused naming conventions with formatting conventions for letters and subsequently suggested the convention of ‘full block letter’.
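A naming convention of the kind accepted here, recipient name plus date, can be sketched as follows (the surname, date and `.doc` extension are invented for illustration):

```python
from datetime import date

def letter_filename(recipient: str, sent: date) -> str:
    """Build a file name that distinguishes letters by recipient and date."""
    # ISO dates (YYYY-MM-DD) also keep files in chronological order when sorted.
    return f"{recipient}_{sent.isoformat()}.doc"

print(letter_filename("Nguyen", date(2000, 11, 14)))  # Nguyen_2000-11-14.doc
```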

Question 3

Acceptable responses are provided for each case study (students were only required to answer two case studies).

Case study A

Software tool: desktop publishing (DTP), word processing (WP) or specific package name, for example Microsoft Word.
Reason: Notification of a meeting would typically be in the format of either:

  • a flyer (done most effectively using DTP) with text and images
  • a letter (done most effectively using WP) applying the mail merge facility.

If students nominated database software they only scored marks if they explained how the database was used to merge with a letter or a report. Email and web packages were not awarded marks since not everyone has Internet access.
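The merge the examiners describe, combining stored recipient records with a standard letter, can be sketched in Python (the names, date and venue below are invented for illustration):

```python
# Minimal mail-merge sketch: each record from the "database" is
# substituted into the placeholders of a standard letter template.
template = "Dear {name},\n\nThe next meeting is on {date} at {venue}.\n"

records = [
    {"name": "A. Smith", "date": "14 November", "venue": "Room 12"},
    {"name": "B. Jones", "date": "14 November", "venue": "Room 12"},
]

letters = [template.format(**record) for record in records]
print(letters[0])
```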

Case study B

Software tool: project management, database or spreadsheet (used as a database/or Gantt chart) or specific package name, for example Microsoft Project or Inspiration.
Reason: The coordinators need to create lists of people, which is done most efficiently using a database.

Since a timeline for completing tasks is needed, project management tools allow both the tasks to be listed and a timeline to be identified.

Other answers were accepted where the student was able to clearly explain how the software produced a planning document.

Case study C

Software tool: spreadsheet or database or finance/accounting packages.
Reason: Automatic recalculation occurs when figures are changed; calculations are re-executed when different rates are entered.
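The recalculation behaviour being rewarded here can be mimicked in a short sketch: change one input (a rate) and every dependent figure is recomputed, much as spreadsheet formulas re-evaluate. The base figures and rates are invented:

```python
def recalculate(base_figures, rate):
    """Recompute every figure for a given rate, as spreadsheet cells would."""
    return [round(figure * rate, 2) for figure in base_figures]

base = [10.00, 25.00, 40.00]
print(recalculate(base, 1.10))  # all figures recomputed for a 10% rise
print(recalculate(base, 1.25))  # same cells, different rate entered
```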

Question 4

This question was extremely well answered by students. Clearly the choices made this question accessible for all students and the subject matter had been well covered by all teachers. Students obtained full marks if suitable explanations were provided; however, if a generic answer such as efficiency or effectiveness was provided, then no mark was allocated.

Consistent placement of navigation buttons

  • ease of use for visitors to the site
  • people can quickly locate the button to go to a preferred page from anywhere.

Graphics 30k

  • page loads in an acceptable time (quickly)
  • users not waiting a long time to view page.

Videos selectable option

  • large files take too long to load
  • not all users will want to see the video.

Size of file is given for downloading

  • user knows how much space is required
  • user is able to estimate time for download.
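The download-time estimate mentioned above is simple arithmetic: file size in bits divided by line speed in bits per second. A sketch, assuming a 56 kbit/s modem of the period as the example connection:

```python
def download_seconds(size_kb: float, speed_kbps: float) -> float:
    """Estimate download time from file size (KB) and line speed (kbit/s)."""
    bits = size_kb * 1024 * 8          # kilobytes -> bits
    return bits / (speed_kbps * 1000)  # kbit/s -> bits per second

# A 30 KB graphic over a 56 kbit/s modem takes roughly 4.4 seconds:
print(round(download_seconds(30, 56), 1))
```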

Page to fit 800 x 600 resolution

  • user not required to scroll
  • quality of presentation maintained
  • most monitors have 800 x 600 resolution or higher.

Underlining is NOT used

  • underlining is for hyperlinks (convention)
  • user could be confused when underlined text does not go anywhere.

Question 5

While this question was generally well handled, two aspects need to be considered. Firstly, a small number of students confused the words software and hardware; secondly, a minority of students were unable to adequately explain the function of the item, preferring instead to describe the hardware.

Software:

  • browser (Netscape Communicator, Internet Explorer)
  • dial-up software, modem software, TCP/IP software
  • operating system (Windows).

Hardware:

  • modem, which converts digital signals to analog and vice versa, enabling messages to travel over the phone network
  • cable/telephone line, which enables packets to pass between connected computers
  • any other item for which the student provided a clear explanation, such as a monitor, computer or even a phone jack (it allows information to pass through the telephone cable).

Question 6

This question was misinterpreted by a large number of students, who believed it referred to backup. Backup procedures have clearly been very thoroughly covered by teachers, and students came up with very sound strategies; however, no marks were awarded for this response. Similarly, very few students related their answer to a software tool as indicated in the question. An acceptable response was:

Method: Print Preview (Preview in browser, page layout view). Students were expected to relate the method to particular software such as using Datasheet view in Microsoft Access.

Purpose:

  • check that output fits within specified margins
  • check that output is visually balanced
  • check that appearance is correct before printing to paper.

As very few students gave the expected response, marks were also awarded for correct definitions of soft copy.

Copyright © 2002 Victorian Curriculum and Assessment Authority
Last Updated: Tuesday, 12 February, 2002