Assessment resource unit standard 27658
Assessment schedule: Bills bounce
Task One

Evidence/Judgements for achievement

The MIDI sequence is completed for a minimum of electric piano and drum-kit, is a minimum of 12 bars, is musically convincing and has used the range of processes, features and functions of the application.

The student is able to describe MIDI data, audio data and meta data according to their type and specification.

The student is able to demonstrate and apply knowledge of electronic music production processes using sequencing application(s) according to their documented specifications.

Examples of student responses:
Processes: recording, capturing, editing, mixing, playback, bounce, store.
Features/functions: track, click, region, add effect(s).

How did you input the notes?

To make the sequence, I used the features and functions of the programme, where the equipment is plugged in and going. I started the sequencer software, made a file with tracks, using the 2 tracks required, the electric piano and drums. I saved my file to my folder. To make music the tracks are set to record. I can put it in by step one note at a time, choosing rhythm and pitch. I used the instruments on the tracks so I could hear each one in my headphones when I used the track. I was told I needed to change the volume of the track at times.

Using the keyboard and the Mac I put in the notes by step one by one. The KB and Mac were connected and on. It came up on the screen when I put the notes in. I had mistakes but the software made takes, so I kept going from where I was up to. The MIDI data from the KB put in by step had the notes by name, with the rhythms by length. These rhythms were in numbers, less for short, more for long. The rhythm showed with the time grid of the beats in the bars. The dynamics were 0 is not heard and 127 is loudest.

When you mixed your sequence, how did you alter it?

To edit with the functions and features to fix my mistakes, I had played from the start and fixed where I found mistakes, clicking on the note to drag it up or down, or dragging it longer or shorter.

When it was all right for the notes I used the mixer, using the volumes on the tracks to get a mix fitting the sounds together. I tried effects like reverb so it is like the sounds if they bounce around the walls. Delay was echoing, but the echo needed to be in time. MIDI data was pitch, rhythm, and timing. Audio data is the audio sounds, meta data is other data for switching on a new instrument sound, changing the mixing desk volumes by itself, or to make the drums quieter or louder. When I'd done my mix I made a mixdown of all the tracks together.

 New Zealand Qualifications Authority 2017

Evidence/Judgements for achievement with merit

The MIDI sequence is completed for a minimum of electric piano and drum-kit, is a minimum of 12 bars, is musically convincing and has used the range of processes, features and functions of the application.

The student is able to describe MIDI data, audio data and meta data according to their type and specification.

The student is able to demonstrate integration of knowledge between the processes and the features and functions of the application(s) used to assemble the sequence.

The student uses technical language to describe the processes, features and functions, and data types.

Examples of student responses:
Processes: recording, capturing, editing, mixing, playback, bounce, store.
Features/functions: track, click, region, add effect(s).

How did you input the notes?

Before recording my sequence, what I need to do is use the features and functions of the programme to complete my task, to make a sequence, where the equipment should be plugged in and going well. The things I did was to boot the sequencer software, set up the file by making tracks in the arrange window, choosing my 2 tracks of electric piano and drums for the instruments, saving my file with my name on it & into the student work folder. To enter my music, I armed the tracks to record with the record button on each track. If I used step entry I chose the notes rhythm as I went along and tried to put each on the right pitch. I needed to make sure the right instrument for each track was heard when I used that track, so the sound was in my headphones. Sometimes I needed to change the volume of the track.

To put in the music step by step on the MIDI keyboard with the Mac software, they were connected and working and on, using the USB cable. I played the MIDI KB and knew it was working ok when I could hear it as I played it and it came up on the screen. I used the click to make a click per beat and a clack per bar. Sometimes I made mistakes and needed to complete the recording of the whole track, but the application made a region of that take, so I could start from where I was up to. The MIDI data made by inputting from the KB (live or by step) had the pitch named as I played it, and the rhythm made from the length of each note played, giving them pitch names and numbers (higher numbers are higher pitch). The rhythm was made up of the length of time I played each note, where a crotchet had 240 clicks, so less for quavers and more for minims etc. The rhythm can be seen in time in the time grid scale in the arrange window, which was set to beats/bars. The dynamics is called velocity where 0 is not heard and 127 is loudest.

When you mixed your sequence, how did you alter it?

Other functions and features of editing helped me fix mistakes, if it was the pitch or rhythm. I played it from the start or where I wanted, to fix where I made mistakes. I can click on the note in the matrix to drag the pitch up or down. If the rhythm was wrong I might try rerecording it to get it right or dragging the end to make it longer or shorter.

When it was all right for the notes I can make my mix in the mixer, changing the levels on each track to get a good mix of the sounds together, making the sounds fit well together or stand out when necessary. I also add effects to give more interest to my mix. Reverb acts like it would bounce from the walls of different rooms, bigger/smaller or brighter or duller depending on what the walls were made of. Delay is as if the sound was echoing, and the rate of the delay needs to be in time with the music, so the echo doesn't repeat out of time. MIDI data includes things like pitch, duration, or bar placement. Audio data is the audio recording as computer data, meta data is extra data that would control things like changes of instrument sound or automatically adjust the mixing desk volumes, like if the drums were quieter at the start, then louder in the middle. When I'd completed my mix I made a mixdown, that is a bounce where all the tracks are mixed together into a stereo track.

Evidence/Judgements for achievement with excellence

The MIDI sequence is completed for a minimum of electric piano and drum-kit, is a minimum of 12 bars, is musically convincing and has used the range of processes, features and functions of the application.

The student is able to describe MIDI data, audio data and meta data according to their type and specification.

The student is able to demonstrate a high level of integration of knowledge between processes and the features and functions of the application(s) used to assemble the sequence.

The student uses a range of technical language confidently and accurately to describe the processes, features and functions and data types.

Examples of student responses:
Processes: recording, capturing, editing, mixing, playback, bounce, store.
Features/functions: track, click, region, add effect(s).

How did you input the notes?

The process I followed was to boot the sequencer application, the sequencer file set up with having created tracks in the arrange window, allocating timbres (from the sound library) as required for my 2 tracks of electric piano and drums, saving my file with an identifiable name & into the correct digital storage location. To be ready to enter my music, the sequencer tracks would need to be record armed on each track (the red “R” box shaped button on the track parameters), or set the duration(s) for step entry(s). The correct timbre(s) were sounding, the audio signal was routed to appear in my specified monitors/speakers/headphones and I could adjust the levels as required (at the track level, at the amplification level).

To enter my music material by input method the USB MIDI keyboard & AppleMac needed to have correct connections for full function, ensuring the USB MIDI cable connecting the MIDI KB to the AppleMac, with the USB MIDI KB being on at the on/off switch. As I played keys on the USB MIDI KB there should be confirmation of the MIDI signal being generated at the KB and showing up in the MIDI receive in the transport window (where the duration and pitch flash up). Or a MIDI guitar could be used to capture the music data. To ensure I remain in time I will turn the metronome click on, that will sound a click per beat and a clack per bar. If I record less than the whole track in one take that is ok, the application creates a region that defines that take. I can copy/paste or join (merge) regions at will. Both of these methods create, by recording, the MIDI data that contains pitch (as pitch named from middle C as C4, so an octave higher is C5), duration (by length as whole or part fraction of 240 clicks per crotchet/quarter duration), placement (in time in the pulse/beat/bar, relating to the time grid scale along the top of the arrange, which I set to beats/bars, time in seconds, as samples, or SMPTE frames as I want), as well as dynamics (velocity [as for most data] is of a range of 0 (lowest) to 127 (highest), or 1-128 for some American/Japanese machines).

Using further functions and features, it is necessary to check the entered music material for correctness of pitch/rhythm/tempo etc by playing it back (playback, where I can toggle to start of sequence, or start from a chosen place, or loop a preferred section), identifying errors by beat/bar, before editing my music material from these error checks, to correct any errors of pitch/rhythm/tempo etc. For each correction, the note is identified in the matrix window and I can double click on it to alter the parameters, pitch being described numerically (C0 is low pitch, C7 is high etc), in the scale order A to G cyclic, accounting for accidentals as required (# or b), with the rhythm consisting of the duration (lengths by 240 clicks per quarter duration) and placement by beat (1-4 if in 4/4) and bar (1-12 for this sequence).

When you mixed your sequence, how did you alter it?

When my sequence has the correct content I can go into the mixer and set the level per track to create a balanced sound, with prominence to each part as required. Here I can also add effects to give more depth and interest to my mix. Reverb can give a track a greater warmth to the sound, as it would reflect from the walls of bigger, smaller or more/less reflective/absorbent/refractive surfaces. Delay could add interest as if the sound were having a good amount of echo/repeat, but I need to be careful to ensure the rate of the delay matches the pulse/beat of the music, so it doesn't repeat out of time. MIDI data includes all the above such as pitch, duration, bar placement. Audio data would be the data of an audio track if I were to add one, meta data would be additional data that would often act as a “controller”, for example I could get the electric piano to change half way through from an old Rhodes to a modern phasing Wurlitzer. Or I could apply level automation to the drum track to get the mixer to automatically turn the drums down at the start, to ensure they are not too overpowering, then up a bit in the middle where the music crescendos. When I'd completed my mix I made a bounce where all the tracks are mixed together into a stereo track, where I used the file > bounce function, setting the start and end markers and the resultant file format.

Task Two

Evidence/Judgements for achievement

A stylistically consistent score is completed for a minimum of 12 bars with a minimum of six of the following musical elements completed accurately:
• pitches, including accidentals
• rests
• rhythms
• chord indications
• tempo or metronome marking
• feel
• repeat sign
• anacrusis

The student is able to demonstrate and apply knowledge of music notation application(s) by creating a notated score according to documented application specifications.

Examples of student responses:
Processes: layout, recording, capturing, editing, mixing, playback, bounce, store.
Features/functions: track, click, region, add effect(s).
Data types & specs: MIDI data, audio data, meta data.

Evidence/Judgements for achievement with merit

A stylistically consistent score is completed for a minimum of 12 bars with a minimum of six of the following musical elements completed accurately:
• pitches, including accidentals
• rests
• rhythms
• chord indications
• tempo or metronome marking
• feel
• repeat sign
• anacrusis

The student is able to demonstrate integration of knowledge between the purpose of the processes and the features and functions of the application(s) used to recreate the score.

Examples of student responses:
Processes: layout, recording, capturing, editing, mixing, playback, bounce, store.
Features/functions: track, click, region, add effect(s).
Data types & specs: MIDI data, audio data, meta data.

Evidence/Judgements for achievement with excellence

A stylistically consistent score is completed for a minimum of 12 bars with a minimum of six of the following musical elements completed accurately:
• pitches, including accidentals
• rests
• rhythms
• chord indications
• tempo or metronome marking
• feel
• repeat sign
• anacrusis

The student is able to demonstrate a high level of integration of knowledge between the purpose of the processes and the features and functions of the application(s) used to recreate the score.

Examples of student responses:
Processes: layout, recording, capturing, editing, mixing, playback, bounce, store.
Features/functions: track, click, region, add effect(s).
Data types & specs: MIDI data, audio data, meta data.
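The MIDI data specification the student responses describe (note names counted from middle C as C4, velocity from 0 silent to 127 loudest, durations and placement in 240 clicks per crotchet) can be sketched in code. This is an illustrative Python sketch only, not part of the assessment resource; the NoteEvent class and helper names are invented for the example.

```python
# Illustrative sketch: how a sequencer might represent the MIDI note
# data described in the responses. PPQ = clicks (ticks) per crotchet.
PPQ = 240

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def note_name(number: int) -> str:
    """MIDI note number -> name, with middle C (60) written as C4."""
    octave = number // 12 - 1
    return f"{NOTE_NAMES[number % 12]}{octave}"

class NoteEvent:
    """One recorded note: pitch and velocity (each 0-127, velocity 0 is
    not heard, 127 is loudest), plus start position and duration in ticks."""
    def __init__(self, pitch, velocity, start_ticks, duration_ticks):
        assert 0 <= pitch <= 127 and 0 <= velocity <= 127
        self.pitch, self.velocity = pitch, velocity
        self.start_ticks, self.duration_ticks = start_ticks, duration_ticks

    def bar_and_beat(self, beats_per_bar=4):
        """Placement on a beats/bars time grid (1-based), as shown
        along the top of the arrange window."""
        beat_index = self.start_ticks // PPQ
        return beat_index // beats_per_bar + 1, beat_index % beats_per_bar + 1

# A crotchet middle C on beat 1 of bar 1, and a quaver an octave higher:
n1 = NoteEvent(60, 100, 0, PPQ)        # C4, 240-tick duration
n2 = NoteEvent(72, 80, PPQ, PPQ // 2)  # C5, 120-tick duration, beat 2
```

A quaver is half a crotchet, hence `PPQ // 2` ticks, matching the responses' "less for short, more for long" description of durations.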
Example response (achievement):

To make the score, I used the features and functions of the programme, where the equipment is plugged in and going. I started the notation software, making a score with staves, using the 2 staves required, the electric piano and drums. I saved my file to my folder. To make music the staves are set to record in the mixer or I can put the notes in by step one note at a time, choosing rhythm and pitch with the mouse. I used the instruments on the staves so I could hear each one in my headphones when I used the stave. I was told I needed to change the dynamics of the stave at times.

Using the mouse and the PC I put in the notes one by one. The KB and PC were connected and on but I found it easier with the mouse. The note came up on the screen when I clicked the notes into the stave. I had mistakes but the software put the note in, so I kept going from where I was up to and could come back to it later. The MIDI data from the mouse put in one at a time was the notes on a line or space, with the notes rhythms from what I chose. These rhythms were in notes, semiquaver or quaver for short, crotchet, minim or semibreve for longer. The rhythm showed on the stave in the beats in the bars. The dynamics were using Italian.

To edit with the functions and features to fix my mistakes, I had played from the start and fixed where I found mistakes, clicking on the note to drag it up or down, or clicking on it then a longer or shorter note from the palette.

When it was all right for the notes I used the mixer, using the volumes on the staves to get a mix fitting the sounds together. I tried effects like reverb so it is like the sounds if they bounce around the walls. MIDI data is pitch and rhythm of the notation, meta data is other data for a title and that sort of thing.

Example response (merit):

Before entering my score, what I need to do is use the features and functions of the programme to complete my task, to make a score, where the equipment should be plugged in and going well. The things I did was to boot the notation application software, set up the file by making staves in the playback window, choosing my 2 staves of electric piano and drums for the instruments, saving my file with my name on it & into the student work folder. To enter my music, I armed the mouse to enter the notes by selecting first the rhythm then putting it onto the pitch. I used step entry by choosing the notes rhythm as I went through the music to put in and tried to put each on the right pitch on the staves lines or spaces. I needed to make sure the right instrument for each stave was heard when I used that track, so the sound was in my headphones.

To put in the music by step using the mouse with the PC notation software, they were connected and working and on, the mouse using a USB cable connection. I placed notes onto the stave and I knew it was working ok when I could hear it as I entered it and it came up on the stave. Sometimes I made mistakes, but the application made the notes on the stave, so I could keep working from where I was up to. The MIDI data made by inputting from the mouse by step (the live keyboard playing had some lag behind which was offputting) had the pitch put on the stave where I put it, after selecting the rhythm from the notes palette. The rhythm was made up of the notes one after another. The rhythm can be seen left to right on each stave in the playback window, which was divided up already with bars. These rhythms were in notes, semiquaver or quaver for short, crotchet for a medium length, minim or semibreve for longer. The dynamics were added as I went along using the create > text > expression functions, using Italian terms.

Other functions and features of editing helped me fix mistakes, if it was the pitch or rhythm. I played it from the start or where I wanted, to fix where I made mistakes. I can click on the note in the playback window to drag the pitch up or down. If the rhythm was wrong I selected it then clicked on the correct note in the note palette.

When it was all right for the notes I can make my mix using the mixer as well as dynamic markings, changing the levels on each stave to get a good mix of the sounds together, making the sounds fit well together or stand out when necessary where the dynamics were placed and attached to notes. MIDI data includes things like pitch, duration, or bar placement. Meta data is extra data that would be things like title, composer.

Example response (excellence):

Prior to entry of the music pitch and duration, my processes use the features and functions to meet the needs of my outcome, to create a score, where the equipment should be set up and functioning correctly to allow my tasking. The process I followed was to boot the notation application from the start menu or desktop, the score file set up with having created staves in the create instruments window, allocating instrument timbres from the sound library per grouped instrument sections as required for my 2 staves of electric piano and drums, saving my file with an identifiable name & into the correct digital storage location. To be ready to enter my music, after the score staves were created, I used the mouse to select first the duration from the floating notes palette (that also includes 2nd or more parts, other notation features such as pause etc) to select the duration(s) for step entry(s), then placed that duration onto the stave, in the correct line or space or ledger line for that clef. The correct instrument timbre(s) were sounding, the audio signal of the stave was connected to appear in my specified headphones (this is set up at the software installation) and I could adjust the levels as required using the stave level in the mixer, or at the overall PC volume output level.

To enter my music material by step input method using the mouse with Sibelius, both needed to have correct connections for full function, ensuring the USB mouse cable is connecting the mouse to the PC's USB input, and the PC powered up (with the power plugged into the wall power socket) with the monitor and tower. As I chose the durations in the notes palette using the USB mouse there is confirmation of the MIDI signal being generated from the mouse as I entered the pitch onto the stave, as it showed up on the stave, where the duration and pitch are together as one note. Or a MIDI KB could be used to play the music data but I found that the latency detracted from the functionality of this method. If I used flexitime input (live KB playing) I would ensure I entered in time by turning the metronome click on at the button on the transport with the picture of the metronome on it, which will sound a click per beat and a clack per bar. I could change the metronome click options by using the really full reference manual that was available in the help function menu (page 43 for metronome). If I used step entry (one note at a time with the mouse), the application creates a note that defines each duration I click into the stave. These rhythms were in durations, semiquaver the shortest I used (at a quarter of a count), or quaver for short (half a count), crotchet for a medium length 1 count, minim for 2 counts long or semibreve for longer at 4 counts. I could use dotted durations to add half the duration again. Both of these methods create the MIDI data that is shown as the notes on the stave, each containing the pitch as identified from the placement onto the stave, on the correct line, space or ledger line, and the duration type chosen from the note palette; these relate to the bars preset out in each stave.

Using further functions and features, it is necessary to check the entered music material for correctness of pitch/rhythm/tempo etc by playing it back (playback, where I can toggle to start of sequence, or start from a chosen place by clicking on a bar or note to start from), identifying errors by beat/bar, before editing my music material from these error checks, to correct any errors of pitch/rhythm/tempo etc. For each correction, the note is identified in the playback window on each stave and I can click on it to alter the pitch (drag up or down or use the up/down arrows).

When my score has the correct content on each stave including the dynamics, if necessary I can go into the mixer and set the level per track to create a balanced sound, with prominence to each part as required and overall fidelity. Here I can also add insert effects to give more depth and interest to my mix. Reverb can give a track a greater warmth to the sound, as it would reflect from the walls of bigger, smaller (more or less of a full reverb warmth) or more/less reflective/absorbent/refractive surfaces (brighter or more leaden in sound). MIDI data includes the musical instrument digital interface notes such as pitch, duration, bar placement. Meta data would be additional data that could include title, composer, section names. I could make an audio data stereo bounce track, from the application function file > export > audio, where the application makes a *.wav file of the score from the start to finish (I need to ensure the file is set to the start of the piece and the correct instruments are set up in the mixer: the instrument type that allows recording; some older versions used Kontakt).
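The note-duration arithmetic in the excellence response (semiquaver a quarter of a count, quaver half, crotchet one, minim two, semibreve four, a dot adding half the duration again) can be expressed as a short sketch. This is illustrative Python only, not part of the assessment resource; the names are invented for the example.

```python
# Illustrative sketch: British note names mapped to counts (crotchet = 1),
# as described in the excellence response.
DURATION_COUNTS = {
    "semiquaver": 0.25,  # a quarter of a count
    "quaver": 0.5,       # half a count
    "crotchet": 1.0,     # one count
    "minim": 2.0,        # two counts
    "semibreve": 4.0,    # four counts
}

def counts(name: str, dotted: bool = False) -> float:
    """Duration in counts; a dot adds half the duration again."""
    value = DURATION_COUNTS[name]
    return value * 1.5 if dotted else value

# A dotted minim lasts 3 counts; a semibreve fills a bar of 4/4:
assert counts("minim", dotted=True) == 3.0
assert counts("semibreve") == 4 * counts("crotchet")
```

At 240 clicks per crotchet, multiplying these counts by 240 gives the tick durations the sequencing responses describe.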
© New Zealand Qualifications Authority 2017