Video Crossfading

This seems to be my biggest hurdle so far.  This might be the solution.

Screen Shot 2014-10-19 at 4.29.35 PM

I was able to adapt a user-made crossfader to get two of the videos to fade from one to another.  I need to make a “stop” for the faded-out video upon crossfade completion, and the crossfade also needs to work between three videos, depending on the number of people (blobs).

Screen Shot 2014-10-19 at 5.43.47 PM

 

I’ve worked out a three-way, toggling video crossfader, triggered by MaKey MaKey circuit completion.  Now to make it more conditional.
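Since an Isadora patch doesn’t translate directly into text, here’s a rough sketch of the behavior I’m after, in Python: three channels, a linear fade between the outgoing and incoming one, and the outgoing movie stopped once its intensity reaches zero.  Everything here (VideoChannel, crossfade, on_circuit_complete) is invented for the sketch, not an Isadora actor.

```python
import time

class VideoChannel:
    """Stands in for one movie player + projector pair."""
    def __init__(self, name):
        self.name = name
        self.intensity = 0.0   # 0.0 = invisible, 1.0 = fully visible
        self.playing = False

    def set_intensity(self, value):
        self.intensity = max(0.0, min(1.0, value))
        # Stop the movie once it has fully faded out, so it is not
        # still running invisibly in the background.
        if self.intensity == 0.0:
            self.playing = False
        else:
            self.playing = True

def crossfade(channels, from_idx, to_idx, duration=3.0, steps=30):
    """Linear crossfade from one channel to another over `duration` seconds."""
    for step in range(steps + 1):
        t = step / steps
        channels[from_idx].set_intensity(1.0 - t)
        channels[to_idx].set_intensity(t)
        time.sleep(duration / steps)

# Three channels, toggled in turn by each MaKey MaKey "circuit complete" event.
channels = [VideoChannel("Ples"), VideoChannel("UnPles"), VideoChannel("Ples+")]
current = 0

def on_circuit_complete():
    global current
    nxt = (current + 1) % len(channels)
    crossfade(channels, current, nxt)
    current = nxt
```

In the patch itself, the equivalent move is presumably fading the projector intensities against each other and stopping whichever movie player has hit zero.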

Screen Shot 2014-10-20 at 2.09.36 PM

 

Project Function

Reaction to Legalism.

The project uses the punishment/reward principles of the Chinese philosophy of Legalism.  Collaborative compliance is rewarded, non-collaboration is punished, and individualism is ignored.

Project Functions:

Ples: Stereotypically pleasant videos and sounds.
UnPles: Stereotypically unpleasant videos and sounds.
Ples+: Brighter, more vibrant pleasant videos and sounds.
MMc: MaKey MaKey circuit completion.
PosRef: Affirmation of intended action; positive reinforcement.

  • When no one is around, the project sits dormant, but it is always scanning.
  • Once one or more people are detected, it triggers the slow transition into Ples.
  • If only one person is detected, the state remains in Ples.
  • If more than one person is detected, a timer is triggered.
  • After the length of time set on the timer, there is a slow crossfade to UnPles.
  • If MMc occurs before UnPles, then slow crossfade to Ples+.
  • If MMc occurs after UnPles, then slow crossfade to Ples+.
  • Upon MMc, PosRef (a rough state-machine sketch of these rules follows below).
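To keep the conditional logic straight while I build it, here is the same set of rules written as a small state machine in Python.  This is only a sketch of the list above, not what runs in Isadora; the blob count, the MMc flag, and the timer length are all placeholders.

```python
DORMANT, PLES, UNPLES, PLES_PLUS = "Dormant", "Ples", "UnPles", "Ples+"

class Installation:
    def __init__(self, unples_delay=30.0):
        self.state = DORMANT
        self.timer = None              # seconds remaining until the UnPles crossfade
        self.unples_delay = unples_delay

    def update(self, blob_count, mmc_triggered, dt):
        """Call once per frame with the current blob count and MMc status."""
        if mmc_triggered and self.state in (PLES, UNPLES):
            # MMc before or after UnPles both resolve to Ples+, with PosRef.
            self.positive_reinforcement()
            self.crossfade_to(PLES_PLUS)
            self.timer = None
            return

        if self.state == DORMANT and blob_count >= 1:
            self.crossfade_to(PLES)
        elif self.state == PLES:
            if blob_count > 1 and self.timer is None:
                self.timer = self.unples_delay     # more than one person: start the countdown
            elif blob_count <= 1:
                self.timer = None                  # only one person: stay in Ples
            if self.timer is not None:
                self.timer -= dt
                if self.timer <= 0:
                    self.crossfade_to(UNPLES)
                    self.timer = None

    def crossfade_to(self, new_state):
        print(f"slow crossfade: {self.state} -> {new_state}")
        self.state = new_state

    def positive_reinforcement(self):
        print("PosRef: play the affirmation cue")
```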

IMG_2432_1

 

Now that all that is down on paper, I can use it as a reference for building the interface.  It works great as a checklist of actors.

Screen Shot 2014-10-19 at 3.39.28 PM

Scene Crossfade

This is more of a note to myself than any sort of tutorial.  I was able to get the scene to crossfade when a timer expires.  I also did a bit of housecleaning.

 

Screen Shot 2014-10-18 at 2.40.09 PM

Isadora Blob Counting

I was able to get the camera to see when there were two blobs, and then start playing a video and a separate audio track after a certain amount of time had elapsed.  Here’s the screenshot.

Screen Shot 2014-10-14 at 2.27.33 PM

 

I need to have it play something immediately once two blobs have been found, and then crossfade to another track of video and sound once a period of time has elapsed, unless two keyboard triggers have been registered simultaneously.
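Written out as plain logic (again just a Python sketch, not the patch), that behavior looks roughly like this; the key names, the crossfade delay, and the window for “simultaneous” are all placeholder values.

```python
import time

CROSSFADE_DELAY = 20.0        # seconds before the crossfade kicks in
SIMULTANEOUS_WINDOW = 0.25    # how close together the two key presses must land

last_press = {"key_a": None, "key_b": None}   # placeholder MaKey MaKey keys
playback_started_at = None
crossfaded = False

def on_blob_count(count):
    """Play something immediately once two blobs have been found."""
    global playback_started_at
    if count >= 2 and playback_started_at is None:
        start_video_and_audio()
        playback_started_at = time.time()

def on_key(key):
    """Record when each of the two keyboard triggers last fired."""
    last_press[key] = time.time()

def keys_pressed_together():
    a, b = last_press["key_a"], last_press["key_b"]
    return a is not None and b is not None and abs(a - b) <= SIMULTANEOUS_WINDOW

def tick():
    """Run every frame: crossfade after the delay unless both keys fired together."""
    global crossfaded
    if playback_started_at is None or crossfaded:
        return
    if keys_pressed_together():
        return                                # cooperation detected, stay put
    if time.time() - playback_started_at >= CROSSFADE_DELAY:
        crossfade_to_second_track()
        crossfaded = True

def start_video_and_audio():
    print("start first video + audio track")

def crossfade_to_second_track():
    print("crossfade to second video + audio track")
```

The open question for the patch is how tight that simultaneity window should be before two presses count as cooperation.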

 

 

Legalism Reactions

Initial Idea:  Forced cooperation for the betterment of the group.  Viewers must both engage in a task, such as putting their hands on a surface.  If they don’t both comply, the experience is designed to punish the participants by creating a negative atmosphere using sound and video.  When they comply, they are rewarded with soothing sounds and video imagery.

Screen Shot 2014-10-07 at 1.56.22 PM

There would be a MaKey MaKey board serving as the physical interface that must be used by the pair of participants.  The experience would be controlled overall by Isadora, which would “see” that participants are present.

Screen Shot 2014-10-07 at 2.44.18 PM

 


This or That

I’ve been torn between keeping my website running on WordPress, and ditching it to use only Tumblr.  I can’t help but want to keep what I control, my website, on my own server space.  So I’m going to try both and see which one gets more attention and interaction.  I’m going to use a couple of widgets to make sure the same posts appear on both platforms.  Let’s hope it works as advertised.

Diskinected

It seems my project has me on the ropes.  More work is going into executing a technique than has gone into the idea behind the project.  That’s a position I seem to get myself into too often.  It’s also the driving force behind my lack of coding-language-based projects over the last couple of years.

My new line of inquiry sits here for the moment.

KinectCoreVision


 

KinectCoreVision can be found here: https://github.com/patriciogonzalezvivo/KinectCoreVision

Izzy Kinect Quartz

The rabbit hole. As I follow the path of others, I’ve come to another obstacle to cross: Synapse for Quartz Composer. Using it will require me to get familiar with Xcode. I’ve avoided coding for the last 8 years, but I guess that’s going to have to come to an end if I want to make the kind of interfaces that I have in my head.

It’s a bit of a whirlwind, following directions that I don’t entirely understand. But you just have to dive in, right?

Screen Shot 2014-09-30 at 8.58.56 AM

Screen Shot 2014-09-30 at 9.51.24 AM

Setback!
Screen Shot 2014-09-30 at 9.58.27 AM

Now to figure out why Synapse crashes on startup…

It seems the program’s author only claims compatibility up to a certain version of OS X.  I’m going to try using my old MacBook, which is running an older version of OS X.  Fingers crossed.

It seems my old laptop is too old.  It requires legacy versions of assorted software that have eluded my searches, and none of them can be assured of working even if found.  That’s too much of a time sink without a firm promise of a payoff.  It’s time to rethink my strategy here.

Digging Again

I had the privilege of sitting in a class while the teacher taught the students about remix culture: where it comes from, historical references, etc.  It was a surreal experience to have my memories taught to a class that I’m sitting in.  My brain immediately turned toward music, and then, out of some weird knee-jerk reaction, I turned against my first instinct.  I started thinking of a 3D piece that fit the requirements of the project without fully discarding my musical attachment.  I was drawn to the idea of bringing the sampling tradition of hip-hop into the physical world.  I played around with the idea of dissecting a sample-based song to determine what percentage each sample was of the song’s whole.  I would then cut out pie shapes from the records that the samples came from, in the corresponding percentage of a whole record.  The next part of the project was to find a song.
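The pie-shape idea is simple arithmetic; with invented numbers, the mapping from a sample’s share of the song to the wedge cut from its record would look like this:

```python
# Made-up numbers, just to show the arithmetic: a sample's share of the
# song's runtime maps directly to the angle of the wedge cut from its record.

song_length = 240.0                     # seconds (hypothetical)
sample_seconds = {"record_a": 96.0,     # total seconds each sample is audible
                  "record_b": 84.0,
                  "record_c": 60.0}

for record, seconds in sample_seconds.items():
    share = seconds / song_length       # fraction of the song's whole
    wedge = share * 360.0               # degrees of the record to cut out
    print(f"{record}: {share:.0%} of the song -> {wedge:.0f} degree wedge")
# record_a: 40% -> 144 degrees, record_b: 35% -> 126 degrees, record_c: 25% -> 90 degrees
```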

This became a project changer, as the records behind most sample-based songs have become expensive due to their collectibility.  As a solution to this problem, I realized that my first instinct would make for a more original work and help push me to make music.  Using records I have lying around that I wouldn’t be sad about cutting up, I sampled selections from three records.

The records that were sampled were chosen at random from my yard sale collection. They were all in poor condition: scratched, dusty, stored for decades without dust covers. I made no effort to repair the sound; I wanted it crunchy. I recorded the parts I was interested in and laid them out in Audition. Conveniently, this is how I started making music in the first place, back in 2002. I used to use Cool Edit Pro, which was later bought by Adobe and renamed Audition. It felt natural to use it for this kind of work.

For the arrangement of the song, I allowed the elements of the song to simply present themselves, without any effects and with only minor EQ’ing. I kept the arrangement simple until the finish of the first part. The second part introduces timestretching and stereo-field manipulation to pull the sound apart into the listening space. I do not normalize my tracks, as I love a dynamic range of volume. The whole song serves to set up the third part, where I take each sample and stretch it excessively to create an entirely new feel while maintaining a subtle connection to the rest of the song.

Screen Shot 2014-10-01 at 10.48.18 PM

The secondary intent of the work is to take the mood from slow and lighthearted to a more aggressively introspective one.

Critique for this was a bit more complicated than I initially thought.  I was caught up in the idea of making, more so than in how it would be received.  I did think that music would be tough to crit, as many social factions define themselves using music more than any other cultural aspect.  It quickly dawned on me that it was being presented where all the other projects were visually driven.  At least half of the critique was on the visual presentation of the music, i.e. how the speakers sat on the table, the table itself, and the fact that I chose to set the sampled records upright between the speakers.  One crit even included commentary on the style of the album covers.

From this I took away a couple of things.  Next time, I’ll use a couple of pedestals and turn off the lights to focus attention on the aural display.  The other takeaway is about generational and genre considerations: they can’t be pandered to, and they can’t be avoided.  If I’m going to make music, it is to be for myself, without compromise.  And there will be people who won’t like it.

KinectA

I’m now working on a new (to me) computer.  This means some things have to be done all over again.  That includes installing the needed drivers and software for connecting my Kinect.

I started my research with this article explaining how to get things up and running:

http://createdigitalmotion.com/2013/09/some-easy-ways-to-get-kinect-controlling-music-visuals-on-mac-and-windows/

Screen Shot 2014-09-29 at 10.47.03 AM

The process has already gotten easier since I installed it last year.  It no longer requires command prompt use.  Progress.  The application’s interface allows you to control the distance from the camera that you’d like to focus on.  This makes it much easier to isolate unwanted movement.

Test shot of my office, using the Kinect’s infrared camera.
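KinectA handles the distance control itself, but conceptually “focus on a certain distance” just means clipping the depth image to a near/far band and throwing the rest away.  A minimal sketch of that idea, with made-up depth values:

```python
import numpy as np

def mask_depth_band(depth_mm, near_mm=800, far_mm=2500):
    """Keep only pixels whose depth falls between near and far (millimetres)."""
    mask = (depth_mm >= near_mm) & (depth_mm <= far_mm)
    focused = np.where(mask, depth_mm, 0)   # everything outside the band goes to zero
    return focused, mask

# Fake 4x4 "depth frame" standing in for a real Kinect frame.
frame = np.array([[ 600, 1200, 3000,  900],
                  [1500, 1800, 4000,  700],
                  [ 850, 2600, 2400, 1000],
                  [ 500, 2200, 1100, 3500]])
focused, mask = mask_depth_band(frame)
print(mask.sum(), "of", mask.size, "pixels fall inside the band")
```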

Now to get started on getting Isadora to see the Kinect…

I’m currently poring over the TroikaTronix Forums, as there seem to be several people who have already worked out the process: http://troikatronix.com/troikatronixforum/discussion/93/kinect-driver/p1

Izzy Door A

Isadora, it seems, is a program that fills in a lot of holes of possibility in my work.  I’ve frequently had fragmented ideas about combining scripted functions with multiple forms of media.  This could get really exciting.  It has gotten exciting.

Screen Shot 2014-09-25 at 2.07.14 PM

This setup uses the EQ of a .m4v (QuickTime) file’s audio to control several aspects of the project.
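The analysis itself happens inside Isadora, but the underlying idea is worth noting: split a block of the file’s audio into frequency bands and let each band’s level drive a parameter.  A rough Python sketch, with a synthetic audio block standing in for the movie’s soundtrack:

```python
import numpy as np

def band_levels(samples, sample_rate, bands=((20, 250), (250, 2000), (2000, 8000))):
    """Return the average spectral magnitude in each (low, mid, high) band."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    levels = []
    for lo, hi in bands:
        in_band = (freqs >= lo) & (freqs < hi)
        levels.append(spectrum[in_band].mean() if in_band.any() else 0.0)
    return levels   # each value could drive brightness, zoom, position, etc.

# Fake one block of audio: a 100 Hz tone, so the low band should dominate.
sample_rate = 44100
t = np.arange(2048) / sample_rate
block = np.sin(2 * np.pi * 100 * t)
low, mid, high = band_levels(block, sample_rate)
print(f"low={low:.1f} mid={mid:.1f} high={high:.1f}")
```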

 

I’m still sorting out the simple things, like starting and stopping the video without having to mouse around and click.  Something more like a keystroke.  Basic controls, really.  I’m finding that there’s no shortage of information, and it seems almost everything has been done before.  Once I’ve picked up steam, my endgame is going to be something related to a Kinect controlling the manipulation and playback of a video.

Screen Shot 2014-09-26 at 6.47.45 PM

 

Baby steps.

My start stop playback setup
