Tuesday, February 17, 2015

Class notes 2/17: pitch tracking

Pitch tracking techniques

Auto-correlation: if the waveform is basically repeating, the computer can detect the time lag at which the differences between current samples and past samples are minimal (i.e., where the waveform lines up with itself again). That lag is the period of the repeating sound, and the computer uses it to calculate a frequency. This is probably how guitar tuners work, since a tuner doesn't have to worry much about time delay, but it's not necessarily fast enough for live performance.
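Here's a minimal Python sketch of the autocorrelation idea (illustrative only; not Max, and not necessarily how any particular tuner implements it). It assumes numpy and a mono signal:

```python
import numpy as np

def autocorr_pitch(signal, sample_rate, fmin=50.0, fmax=1000.0):
    """Estimate the fundamental by finding the lag at which the signal
    best matches a delayed copy of itself."""
    signal = signal - np.mean(signal)        # remove DC offset
    corr = np.correlate(signal, signal, mode="full")
    corr = corr[len(corr) // 2:]             # keep non-negative lags only
    lag_min = int(sample_rate / fmax)        # shortest period to consider
    lag_max = int(sample_rate / fmin)        # longest period to consider
    best_lag = lag_min + np.argmax(corr[lag_min:lag_max])
    return sample_rate / best_lag            # period in samples -> Hz

# A 220 Hz sine should come back as roughly 220.
sr = 44100
t = np.arange(2048) / sr
print(autocorr_pitch(np.sin(2 * np.pi * 220 * t), sr))
```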

Spectral analysis: fiddle~ and sigmund~ use this approach; they find harmonic relationships between peaks in the frequency spectrum and use those relationships to determine the fundamental frequency.
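One simple spectral method along these lines is the harmonic product spectrum: compress the magnitude spectrum by factors of 2, 3, 4, and so on, then multiply, so energy at the harmonics reinforces the bin at the fundamental. This is only an illustration of the peaks-imply-a-fundamental idea; fiddle~ and sigmund~ do their own, more sophisticated peak matching:

```python
import numpy as np

def hps_pitch(signal, sample_rate, n_harmonics=4):
    # Windowed magnitude spectrum.
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    hps = spectrum.copy()
    # Multiply in downsampled copies so harmonics pile up on the fundamental.
    for h in range(2, n_harmonics + 1):
        n = len(spectrum) // h
        hps[:n] *= spectrum[::h][:n]
    bin_index = np.argmax(hps[1:]) + 1       # skip the DC bin
    return bin_index * sample_rate / len(signal)

# A tone with harmonics at 220, 440, and 660 Hz should report roughly 220.
sr = 44100
t = np.arange(4096) / sr
tone = sum(np.sin(2 * np.pi * 220 * k * t) / k for k in (1, 2, 3))
print(hps_pitch(tone, sr))
```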

sigmund~ is best at outputting note and volume information, via its "notes" and "env" arguments. What you do with that information is up to you.

It misses notes if you play fast, and it doesn't like multiple pitches at once.

If there's a danger of sigmund~ picking up its own sound or an ensemble member's sound, Chris measures the ambient level (the sound of the audience and other players) and sets a threshold so that sigmund~ only pays attention to sound above it. It's also a good idea to set sigmund~ to ignore pitches below the natural range of your instrument.

peakamp~ object: reports the loudest amplitude within a time interval you set. You can set it up so that peaks below a certain level are ignored.
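The gating idea in Python (the threshold value here is made up; in practice you'd measure the room as described above):

```python
import numpy as np

NOISE_FLOOR = 0.05   # hypothetical measured peak level of audience/ensemble

def gated_peaks(audio, window=1024, floor=NOISE_FLOOR):
    """Yield (window_index, peak) only for windows louder than the floor."""
    for i in range(0, len(audio) - window, window):
        peak = np.max(np.abs(audio[i:i + window]))
        if peak > floor:
            yield i // window, peak   # loud enough to be worth analyzing
        # otherwise: ignore it -- it's just the room
```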

*on Thursday, stay tuned for a discussion of transfer functions for mapping two domains onto a third, "bridging" domain

Monday, February 16, 2015

Final Project, more concrete ideas & timeline

For my final project for the quarter, I'll write a piece for solo saxophone and Max for Live.  I'll use pitch tracking in Max to trigger certain events, and I'll use Ableton's looping and recording capabilities to capture and play back material from earlier in the piece.  Ideally, I'll figure out how to do this automatically, without having to hit buttons during performance.  If all else fails, I can still hit keys on the laptop, or borrow a foot pedal.

Tech-wise, I'll need my computer (as long as it's fast enough), a microphone, and an amplifier or PA system.  I'd prefer an amp, since I could put it behind me and be able to hear my own music.

Knowledge-wise, I need to get familiar with sigmund~ and/or other pitch tracking strategies in Max.  Since I just got Live, I'll have to dig into its recording, playback, and looping capabilities.  Once I'm more familiar with these features, I'll have a better idea of how to compose the piece.  I'd like it to have both more and less rhythmically driven sections, so I can get an idea of how to work in both those realms.

Here's a vague timeline:

Week 7: test patches with sigmund~ etc.
test Live's looping capabilities
start composition

Week 8: complete draft of the piece

Week 9: testing and revision

Week 10: piece ready to perform!  (??!)

Thursday, February 12, 2015

Class notes 2/12: critique of movie patches, animation basics

*look up the trigger object ("t"), as in "t i b i b": what is it?  (It copies its input to multiple outlets in right-to-left order, with each argument setting an outlet's type: i = int, b = bang, etc.)

To sync audio with a movie, trigger "open audio file" and the message that starts the movie at the same time.

It's more computationally intensive for Max to draw video in a jit.pwindow inside the patch than in a separate window, especially if the video is being resized.

When you use jit.movie, set its autostart attribute to zero: type jit.movie @autostart 0.

QuickTime has its own time units, hence the complicated way of getting the current time in a movie, per the example on the website.

qmetro object: Chris recommends it for driving video, since it puts its events on the low-priority queue; we notice dropped frames less than we notice audio dropouts.

*read article about timing Chris posted

jit.rota object: lets you rotate video.  Anchor points specify the point around which the entire frame rotates, so you might accidentally rotate the image right off the screen.

Mapping dimensions of one kind of art onto another is a whole interesting area.  For example, Anthony is interested in correlating lights (color, brightness) with musical harmony (pitches, their relationships).  Keep in mind that linear mapping isn't the way people perceive things; we tend to perceive things logarithmically.
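A quick Python illustration of the difference (the frequency range is arbitrary): a control mapped linearly in Hz puts its midpoint in the wrong place perceptually, while a logarithmic (equal-ratio) mapping lands on the pitch midpoint:

```python
def linear_map(x, lo=110.0, hi=880.0):
    return lo + x * (hi - lo)        # equal steps in Hz

def log_map(x, lo=110.0, hi=880.0):
    return lo * (hi / lo) ** x       # equal steps in pitch (ratios)

for x in (0.0, 0.5, 1.0):
    print(x, linear_map(x), log_map(x))
# linear: 110, 495, 880 -- midpoint is nowhere near the middle pitch
# log:    110, 311, 880 -- 311 Hz is 1.5 octaves up, the pitch midpoint
```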

jit.lcd object: a place where you can draw stuff.  It creates a matrix and understands certain messages about what to draw.  Type jit.lcd 4 char 320 240 (4 planes means RGB + alpha).  Send its output to a jit.window to show whatever comes out of the LCD object.  Into the LCD, loadbang the font, the background color (brgb), and the foreground color (frgb); all 255s ends up white.  To place text, send a "moveto" message, then prepend "write" to the text and send it to the LCD.

You can read a coll into the LCD, whose contents tell the LCD a series of things to do.

You can add the LCD output to a movie with a jit.plus (jit.+) object, and it will throw titles on there.

paintoval message: follow it with the coordinates of the rectangle you'd like the oval to be bounded by.

Tuesday, February 10, 2015

Class notes 2/10: latency, crossfading

latency: delay, usually from the conversion of voltages to numbers and vice versa; it happens because a buffer must fill before an operation can be performed.  I/O time is about 3 ms in the best case.
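The back-of-envelope arithmetic (the buffer sizes here are typical settings, not measurements of any particular interface): each buffer has to fill before it can be processed, so one buffer each way sets the floor.

```python
sample_rate = 44100
for buffer_size in (64, 128, 256):
    one_way_ms = 1000 * buffer_size / sample_rate
    print(buffer_size, "samples:", round(one_way_ms, 2), "ms each way,",
          round(2 * one_way_ms, 2), "ms round trip")
# 64 samples ~ 1.45 ms each way, ~2.9 ms round trip: the ~3 ms best case
```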

crossfading: separate line~ objects fade one sound out as the other fades in; you can have one signal control both amplitudes.  Check out Chris's mix~.
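The math behind a crossfade, sketched in Python (the equal-power curves are a standard choice, not necessarily what mix~ uses): a plain linear fade dips in loudness at the midpoint, while equal-power gains keep the level steady.

```python
import numpy as np

def crossfade(a, b, x, equal_power=True):
    """Mix signals a and b: x = 0 -> all a, x = 1 -> all b."""
    if equal_power:
        # Gains whose squares sum to 1, so total power stays constant.
        ga, gb = np.cos(x * np.pi / 2), np.sin(x * np.pi / 2)
    else:
        ga, gb = 1 - x, x                 # naive linear fade
    return ga * a + gb * b
```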

matrix~ object: creates an object with a specified number of inputs and outputs, like a little multichannel mixer.

jit.xfade object: crossfades between two videos.  If its xfade attribute is 0, you get only the left video; if it's 1, only the right video; at 0.5, an equal mix of both.
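Per pixel, that crossfade is just a linear interpolation; with video frames as numpy arrays it's one line (a sketch of the idea, not jit.xfade's actual internals):

```python
import numpy as np

def xfade(left_frame, right_frame, amount):
    """amount = 0 -> left only, 1 -> right only, 0.5 -> equal mix."""
    # Result is float; cast back to uint8 before display if needed.
    return (1 - amount) * left_frame + amount * right_frame
```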

Monday, February 9, 2015

Final Project Ideas

Things I definitely want to learn to do in Max, eventually:

-analyze audio input for pitch information and have Max respond based on it: people have made objects you can download, like sigmund~     **also consider amplitude tracking

-automatically change parameters of a sound based on the darkness/lightness (luminance) or colors of a movie

-improvise live with a sampler (preloaded samples, effects)  **I'm told UCI doesn't have a sampler available for checkout; even if it did, I wouldn't be the only person wanting it, so I'd have to go ahead and get a SoftStep**    **go to AMC for MIDI controllers w/ appropriate software**    **look into other MIDI pedals/pedal boards**

-design my own loop station for live performance, with a few extra features (pitch shift, time shift)

-solo piece for voice and Ableton/Max

-solo sample piece using only sounds from saxophone



I'll have to see if I can buy Ableton before the quarter ends.  If I buy it soon, I'll look into possibilities for customizing a loop setup, and I can perform with voice and/or saxophone that way.  If I'm unable to buy Ableton soon, I'll work on having Max analyze pitch input.  That would be valuable for future projects.

Problems with Patcher 5

I managed to get my movie to play, and to control it the way I want.  I tried to set up timepoint objects to trigger a series of sounds as the suitcases fall off the conveyor belt, but for some reason, although Max can load the sound files, it doesn't play them.  It definitely doesn't play them at the times I've indicated with the timepoint objects; in fact, it doesn't seem to play any of them at all, except the first one I loaded, which is the background sound.  I hope I can figure it out tomorrow.

*I'm told I need a transport object to get time point objects to work.  That explains part of it.

*I have to figure out polyphony w/ audio files

Tuesday, February 3, 2015

Class notes 2/3: Aiyun Huang concert

patcherargs: put it in a subpatch and append arguments to it, so the patcher knows how many arguments to expect.  When the patch loads, the arguments come out as a list.  This lets you use the file in another place with reconfigurable arguments.

***connecting gesture to sound in electronic music***  electronic music can be disembodied

***charisma factor: human vs. computer***

***scale factor: human sounds can't get as loud as a speaker***

The performance by Aiyun Huang challenges the instruments-plus-electronics model of electro-acoustic performance.

How do you write a whole piece in Max?
Chris has 3 strategies:
-standard, formal, sectional structure (using a patch for each one, maybe)
-going through a set of processors (like pedals) all the time, can turn them on and off--serial technique
-parallel technique--everything happening all at once, windowing only some of those things

_______
VISUALS: you need four numbers to express the color of one pixel: red, green, blue, and alpha (transparency/opacity)

That works out to about 37 million numbers per second of video at low, standard-TV resolution.
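The arithmetic behind that figure (presumably 640x480 at 30 frames per second, which is roughly standard-definition TV):

```python
width, height, planes, fps = 640, 480, 4, 30   # 4 planes = RGB + alpha
print(width * height * planes * fps)           # 36,864,000 ~ 37 million/sec
```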