The Wanderbirds!

This blog is dedicated to my Mentor and Friend, Kaj Pindal.

IMG_7947

In the fall of 2004 I met Kaj Pindal professionally while working temporarily as the Program Coordinator for the BAA in Animation at Sheridan College. Although I knew of Kaj and had had him as a guest speaker when I was a student at Sheridan in the 90's, I never really got the chance to know him as a colleague in the animation industry. Everybody in the Canadian industry knew Kaj as the father of Peep: his 1962 NFB animation test, The Peep Show, would eventually become the Emmy Award-winning WGBH series Peep and the Big Wide World. Although Kaj had had a distinguished career at the then-fledgling National Film Board Animation Unit in Montreal, and was even nominated for an Oscar in 1967 for the animated short What on Earth with fellow animator Les Drew, it was his career after the NFB that caught most of the students' attention. He had a gift for animating and spent a good deal of time going around the world talking, demonstrating, working and researching with the other great minds and leaders in the industry.

Kaj wandered into my office one day and sat down. It was his way of introducing himself and getting to know the new faculty and staff. I distinctly remember him asking what I did in the industry, and me explaining that I was now producing children's shows, to which he replied, "I might have a couple of ideas…" in his trademark long, Danish-accented Kaj Pindal drawl. Sure enough, by March of 2005, Kaj and I had started working on his next project, called "The Immigrants".

Immi_MipJr

 

Kaj envisioned a family of Penguins living at the South Pole called the "Guins". Their habitat was overcrowded and shrinking with the melting ice cap, and an overly friendly seagull suggested they make their way up to the North Pole, where food and ice were abundant! The "Guins" would venture out into this big wide world as newcomers to strange and different lands as they journeyed north to the promised oasis called the North Pole.

camel

Essentially it was a “fish out of water story” with a nice cultural and environmental theme.  In 2005 it was timely due to the environmental concerns being discussed, but when I see the massive refugee crisis that has hit the world in 2015, I’m awed at Kaj’s innate ability to be forward thinking while discreetly pushing major themes within a global context.

WB-Bears

We got to work developing the pitch bible, checking with broadcasters on their needs, and bringing in production partners for the inevitable animated test that most funders want to see before they put any money up for a full production. One of the major stumbling blocks was the title. Although it was old-school, I knew where Kaj was going with it. In the 1940s, Kaj had spent his youth in Denmark drawing cartoons of Hitler for the Danish Underground, so he had seen his fair share of hardship and flight, but the title had to change because the broadcasters just didn't like it.

In true Kaj fashion, he went away for a bit to ruminate on the dilemma and came back with The Wanderbirds! He based his title on the Wandervogel, a very popular pre-WWI European youth movement that emphasized hiking, swimming, camping and travelling to other countries, and which had a bird for its emblem.

 

WB-Title

With an excellent title in hand and a good working storyline, all we needed was an animation test. I had gotten to know Fatkat Studios in New Brunswick, industry leaders in Flash-animated series, very well. They were a great bunch of classically trained animators, exactly the kind of people Kaj thrived working with. After a trip out to see them, Kaj and I had secured an agreement to produce a trailer, in the hope they would get further work when/if the project was greenlit.

The trailer was short, but as you can see above, the style and timing were all Kaj. I placed the trailer into MipCom Jr. in Cannes, France during the 2005 MipCom broadcast sales and distribution market. With The Wanderbirds pitch in hand, we immediately received interest while at the market. However, now that I was an independent producer and no longer working for Calibre Digital Pictures/Alliance Atlantis, broadcasters were a little risk-averse and advised Kaj and me to partner with a larger Executive Producer in order to make the financing work.

Oddly enough, we had three Canadian Executive Producers interested in representing the project: GalaKids, CCI Entertainment and Shaftesbury Entertainment. In 2006 we came to a development IP agreement with Shaftesbury, who then started working on the broadcasters to fund the series. Over roughly a two-year span, Shaftesbury worked hard to try and bring The Wanderbirds to life, but by then the broadcasters had too many other penguin shows. The IP agreement with Shaftesbury ran out and the property was returned to Kaj and me to try and repitch. It was 2008, at the start of the American recession, so the decision was made to shelve the project indefinitely.

This was the first of many development project experiences I had.  I was grateful to Kaj for putting faith in me to at least represent his final series attempt in a professional manner.  Substantial money was spent, as it always is in our business, but in the end the market decides what is fresh and new.

Watching the events unfold around the world now, I think about how the "Guins" would face these struggles. Hopefully with determination, love and a little ingenuity, as Kaj always envisioned!
WB-Car

Longhouse 3.0.5

Based on all of the great feedback and some excellent research leads, in Stage 3.0.5 of our virtual Iroquoian longhouse project we look at fur, bark and pole positioning to envision sleeping platform construction within a 3D environment. There isn't a considerable amount of reference material available to guide our visualization process, and we will go into further detail later on the visual staging of the interior environment, but we have relied heavily on Dean Snow's 1997 research, The Architecture of Iroquois Longhouses, to determine how our interior bunks will be constructed. We especially wanted to visualize the concept of actual "cubicles" for each sleeping compartment.

Based on European historical accounts, the sleeping platforms that occupied either side of the fire hearths along the interior length of the longhouse were raised 4-5 ft from ground level (Snow, 1997). Snow challenges this assumption by citing later, 1700s-era European accounts in which the sleeping compartments actually consisted of a sleeping level or bottom platform about 30 cm (1 ft) from the ground and a canopy or storage shelf on top no more than 1.5-1.8 m (5-6 ft) off the ground (1997), with storage for additional firewood and possessions below (Heidenreich, 1972).

Clearly, if we follow the earlier historical accounts of sleeping platforms 4-5 ft from ground level, the young and old, as well as most adults, would not only have had great difficulty climbing up onto a platform of that height, they would also have been exposed to the intense layer of smoke from cooking and heating hearths, making it difficult to breathe or see (Sagard, 1939; Smith, Williamson, Fecteau, & Pearce, 1979; JR 10: 91-93). These contested ethnohistorical observations also fail to account for seasonal sleeping preferences or even actual longhouse height, which, if architecturally higher as Wright suggests, would have kept the smoke layer well above standing height (1995).

Further, using references from oral history, the common Iroquoian building measurement was ten (Allen & Williams-Shuker, 1998; Kapches, 1993), believed to be 1.5 metres in length, or equal to the normal size of a body in the sleeping position (Allen & Williams-Shuker, 1998; Kapches, 1993). Dodd, working from the archaeological record, found that the standard depth of the sleeping compartments would have been 1.5-2 m, based on the bunk line pole positions (1984). This is consistent with French missionary descriptions of the time, and with the missionaries' own general height in the 16th and 17th centuries of about 1.6 m, roughly the same as their Iroquoian hosts (Komlos, 2003). Others have suggested, primarily in fictional narratives, that family members also slept on the top bunk.

Therefore, based on support post positioning within the archaeological record, it is generally accepted that the sleeping platforms/family cubicles were roughly 1.1-1.8 m in width, 3.7-4 m in length and 1.8-2 m in height. The actual sleeping platform itself has been recorded as anywhere from 0.30-1 m off ground level, with the roof of the platform, where personal storage is commonly thought to have been, about 2 m from ground level.

Measurements_Metres

Our first attempt in Longhouse 3.0 had the bunk slats running the width of the platform in short 1.8-2.0 m poles. Keeping in mind that pre-contact Iroquoian longhouse builders only had stone axes and fire for the initial harvesting of trees, the notion that they would chop multiple platform poles into even-length slats seemed like a considerable amount of work for relatively little benefit. In F.W. Waugh's Iroquois Foods and Food Preparation, he states:

A method described by David Jack was to tie some saplings around the tree, forming a small, scaffold-like structure. Sods were placed on this, water was poured over them and a fire built up below. By alternately hacking with stone axes and burning, the tree was finally cut through. If it was desired to cut it into lengths, a double pile of sods was made around the trunk where it was to be divided, and fire applied to the space between. Chief Gibson's description of tree-felling was essentially the same, except that, according to him, a quantity of rags was tied to the end of a pole and used for wetting the trunk and localizing the action of the fire. Both Lafitau and Kalm give similar descriptions, indicating the method to have been one in common use. *Lafitau, Moeurs des Sauvages Ameriquain, pt. 2, p.110 & *Kalm, Travels, vol. II, p.38. (1916; p.8)

Thus, we made the decision that it was probably more efficient to harvest fewer but longer poles, which would act as the platform rails for the bunks, running horizontally along the length of the longhouse.

We also kept in mind that poles were generally harvested in the 8-12 m range and that white ash was likely used for the sleeping benches. White ash tends to grow straight with very few branches and keeps a consistent diameter even at length. According to the US Forest Service silvics manual (http://www.na.fs.fed.us/pubs/silvics_manual/volume_2/fraxinus/americana.htm), a 20-year-old white ash will generally be about 4 inches (10 cm) in diameter and 12 m in length. So if we're running a 24 m long longhouse, we could have two 12 m long, 10 cm diameter poles end to end for the sleeping platform support beams. My estimate would be 16 beams (8 for each side of the sleeping platform). The diameters had to be substantial enough to support at least 400-500 lbs of weight (3-4 people) without buckling in the middle, and the poles long enough to be tied down at both ends, and likely in the middle, to the main structural elements.

Double_supports

In switching the direction of the poles, however, it quickly became apparent that a couple of additional enhancements to the bunking system would have been needed to reinforce the poles and deal with the weight of family members and their daily activities on the platforms. Additional support poles were added at the major support posts (see above), and Craig suggested that it would have been better to tie down such long poles in the middle to keep them from shifting (see below).

Middle_Strapping

Posts (anything in the ground was cedar) and beams (white ash) were typically tied together using basswood cordage (wood rope). J.V. Wright supports this approach, although we don't have much visual or oral history to back it up. Hitches or knots aren't explained at all in the historical accounts, but a 1500s image shows a cross hitch/knot where the posts were lashed together (http://www.virtualjamestown.org/paspahegh/structure8.html). We used a threaded looping knot and will use the cross hitch for the major support poles.
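Going back to the beam estimate above, the arithmetic behind the 16-beam figure works out as follows (the four-rails-per-side layout is my own reading of how the platform framing divides up, not something the sources state):

24 m longhouse ÷ 12 m poles = 2 poles end to end per rail run
4 rail runs per platform side × 2 poles each = 8 poles per side
8 poles per side × 2 sides = 16 support beams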

Another issue on our first try was the rounded look of the ends of the poles. Obviously they wouldn't have been uniformly rounded, so we attempted to roughen up the ends of the poles a little more, while recognizing that over time and use the ends themselves would become rounded and dull. There aren't a lot of visual references available for wood cut with stone tools, but Sensible Survival had a blog post on how to make a stone axe. Below is an image from that post which clearly demonstrates how rough the ends of a pole would be.

12 tree cut 5

Below is a still frame from a YouTube video by freejutube, which shows a larger-diameter tree that has been freshly cut with a stone axe. As discussed above, the effort is extensive even to cut small-diameter trees, and the finished product is substantially rough in texture and feel.

maxresdefault

The image below has two end caps that haven't been treated, while the middle end caps have been modelled to mimic the roughness. A texture map will be applied to further enhance the visual look.

LengthWise_Bunks_Ends

Although we will talk further about these little details, a lot of this fine detail will be lost in the final gaming environment, mainly because of lighting effects and the need to reduce model complexity so the game runs in real time. However, seen or not, we are trying to logically address all of the visual elements that may be representative in this virtual reimagination of the archaeological record.

 

Another part of the last blog's discussion was the question of whether bark was removed from the support posts and bunking poles or left on. This is obviously pure speculation, because the oral, historical and archaeological records have no information on it either way. The general consensus from the commentators was that removal of the bark would have been preferred. Completely by accident, Dr. Jennifer Birch suggested a great quasi-ethnographical account by F.W. Waugh entitled Iroquois Foods and Food Preparation, written in 1916 (mentioned above), when I was starting to enquire about the storage of foodstuffs within longhouses. In it, Waugh speaks extensively on the use of bark for a multitude of household and work-related tools, so much so that it seems impossible that Iroquoian longhouse builders wouldn't have also harvested the bark for other needs prior to building the longhouse. In the latest test below, along with testing possible bedding, we ensured that the bark was either partially or almost entirely stripped from the poles. In addition to the removal of the bark, the next step will be to add dirt, creosote, hand prints and other stains to the exposed wood to give the benches a lived-in feeling.

screenshot005

Additionally, we started looking at what the potential bedding would be. Again, there isn't much written on the subject, but everything from cedar boughs and woven mats to various furs has been suggested. Originally we thought Black Bear or Grey Wolf (current species that inhabit Southwestern Ontario), along with the common Deer, would be represented in the form of bedding. However, the faunal (animal) remains within most archaeological sites near the Lawson Site area include limited or no Black Bear or Grey Wolf skeletal remains. Deer, along with medium-sized fur-bearing animals such as Raccoon, Rabbit and Beaver, is much more representative. The test image below shows a mixture of bear, wolf and deer.

screenshot006

Upon further discussion, we decided the next iteration would be a mixture of cedar boughs and primarily deer skin for bedding material. As discussed above, the top level of the bunk may or may not have been used as a sleeping platform. The historical references suggest that the smoke layer sat somewhere around the 4-5 ft level within a longhouse when all of the fires were going. Ron Williamson reports from an experiment done at Ska-Nah-Doht in the middle of winter during the 1970s that, when a few warming and cooking fires were at full capacity within the reconstructed longhouses, the smoke level was dense, making it difficult to breathe and see. Based on the references from the Jesuit Relations and Ron's experience, I would speculate that the top bunk was used primarily for storage, and so for our next round of renderings we'll start placing household objects that might have been stored there.

screenshot007

At this point, the next stages will be to add the cubicle walls, the exterior walls, roofing, fire hearths and vestibules. Again, there are several roofing methodologies and theories that can be visualized and easily reconstructed in 3D, as we've seen in Longhouse 1.0 and Longhouse 2.0; however, we will go with the Kapches model of bent wall poles that terminate at the roof's centre, forming an arbour effect along the roof line. Our decision will be discussed further in the next few posts, but for now we have provided one vision of how the initial internal structure may have been represented within Northern Iroquoian longhouses of the 15th century.

Longhouse 2.1

Longhouse 2.1 was originally intended as a preliminary introduction to basic archaeological research and the visualization of archaeological material for our 10 Loyalist College Animation interns. As Sustainable Archaeology is located directly within the Museum of Ontario Archaeology, the students had direct exposure to the partially reconstructed Lawson Neutral Iroquoian Longhouse Village.

picture of longhouse

Additionally they were within driving distance to the Ska-Nah-Doht Village & Museum, a reconstructed Early Iroquoian Longhouse Village site which provided an excellent example of different architectural styles as well as interpretive visions.

Skanahdoht-Longhouse

The students had the opportunity to physically experience the reconstructed spaces, understand the materials used in the reconstruction and get a sense of the sound, light and atmospherics produced in such a building.

DSC_0200

Following traditional Film & TV methodology, the students used these physical references and the archaeological data from the Lawson site to start envisioning what a 3D representation of a Longhouse would look like.

longhouse_alanb

In representing what is essentially a reinterpretation of the archaeological data, the risk of this process is that there are multiple voices and competing visions at play, as each artist's interpretation, from the initial physical longhouse reconstruction to the reimagined 3D representation, is being played out visually.

longhouse_interior_light1

Yet an opportunity exists in that the assets, as we like to call them in 3D lingo, are easily moved, reconfigured or even reinterpreted, allowing for a more user-centric approach. Even in the two artists' renderings above, little details like the direction of the support slats on the bench seating are different, each representing a different interpretation of the physical reconstructions of the longhouses visited. Within 3D space, these tests can be played out with little effort, representing an opportunity for public stakeholders to engage with the archaeological record through their own perceptions.

Additional 3D models were made to represent the typical material potentially in daily use within and around a longhouse. These assets then become props within the greater phenomenological experience; moreover, through 3D scanning, artifacts from the actual archaeological landscape can now inhabit the virtual archaeological landscape as well.

hanging_tobacco
pottery
cedartree
cornstalks

Construction of the virtual longhouse became an interpretation of the existing physical reconstructed houses, the visual historical material and some archaeological data. Again, the purpose was not to accurately recreate a longhouse per se, but to see what process these trained animators would use to reconstruct a longhouse within 3D space.

LHSpin200

As the models began to materialize, the students started asking the same questions posed by Wright, Kapches and Snow. Additionally, the challenges of modelling the objects in 3D also shaped the visual outcomes, or interpretation, of the subject matter in question.

LHSpin452

Modelling within 3D space sometimes lacks the randomness that real life constantly provides. Assets are replicated, such as the cedar shingles in the image above, and thus the interpretation loses some of the key features we would assume to be present in a typical longhouse construction.

LHSpin610

The final product, although representative of the subject matter, is in essence a copy of a copy.

 

This was a wonderful first run for the students and the archaeologists alike. It provided a unique opportunity for SA to see the production process from a traditional 3D animation methodology, and it prompted the very same questions archaeologists would ask themselves when visualizing the archaeological record. It also provided a jumping-off point to explore the necessity of having real-time, user-defined and engaging content delivery systems.

The exercise provided the assets needed to continue the development process. In Longhouse 2.2, we move into the real-time, user-discovery environment. It also represents a major pivot towards a sustainable and interactive approach to the 3D visualization of archaeological material.

Missing the Point? It’s the experience Dummy!

The last two weeks I was busily developing and presenting a draft of my proposed Research Flow Chart.  My old age must be setting in, because I find it harder and harder to develop succinct research ideas!  In an attempt to make sense of what I am trying to accomplish, I drafted a short paragraph to flesh out the idea and then to act as a guide for my Flow Chart.

Visualizing Southwestern Ontario Socio-Cultural Implications

in Longhouse Morphology and Use

Understanding Longhouse morphology amongst the Southwestern Ontario archaeological landscape as it relates to extinct and descendent populations is problematic.  Historical accounts can be romanticized or even intentionally misleading while socio-cultural variation within homogeneous cultural groups varies wildly based on outside cultural influences, landscape as well as environmental resources and factors.  Visualization of these variable Longhouse features may provide a unique opportunity to engage all stakeholders (public, private, academic and descendent) in redefining what it means to live within a Longhouse community by experiencing it phenomenologically through the archaeological record.

My research will focus on engaging with the archaeological landscape by creating a 3D virtual tool-set specifically designed to allow stakeholders (public, private, academic and descendent) to use a procedural 3D model library in order to build, in real time within 3D space, interactive pre- and post-contact Longhouses of Southwestern Ontario. Further, when deployed, stakeholders should be able to control sound, lighting, environmental and atmospheric settings, in order to focus on the association between the physical structure, spatial relationships and the phenomenological experiences of Longhouse landscapes.

The aim of my project is to develop a new way to engage with the archaeological landscape that will help to broaden our understanding of longhouse construction, community organization and external cultural and environmental influences with an eye towards challenging our current assumptions of longhouse communities within the archaeological record.

Visualizing Longhouse Morphology and Use

Combined with what I think is a good start to a traditional Research Flow Chart, I'm relying heavily on Landscape and Phenomenological Archaeology. When I initially presented the concept, my colleagues became engaged when I started talking about having stakeholders actually experience the environment virtually, with the aid of sound and smell. One colleague, who has been a site interpreter at Sainte-Marie among the Hurons in Northern Ontario, mentioned that when school groups first enter their reconstructed longhouses, people stop in the doorway to let their eyes adjust… I stopped to think: how can I create the same effect in 3D? The museum also uses the smell of a fire burning in the hearth or sweetgrass smouldering, along with the sounds of everyday life, to bring the landscape to life! These phenomenological experiences, combined with visual elements like light, atmospherics (smoke, rain, snow, dust) and texture, help to extend that experience.

Maybe it’s the experience that is more important than how one builds that experience?  Can that experience be reproduced repeatedly?  Should it?

It took a while, but I think “it’s the experience Dummy!“, that I’m finally catching onto.

Cheers,

Michael

EdgeLab!

Just got back from the Digital Media Zone open house! I'll devote more time to the DMZ later, but I ran into the EdgeLab, which shares space at the DMZ. They were making the coolest Social Innovation tools with Arduino that I've ever seen: interactive clothing with predefined words that can be emitted from a speaker or read off an LED. Designed for children and adults with motor function difficulties, this is an amazing use of Arduino, built in such a way that the clothing can be produced by anyone, anywhere!

I really encourage everyone to take a look at the website (http://edgelab.ryerson.ca)!

Twitter War Creative/Arduino Update!

(This Post is a class requirement for History 9832b Interactive Exhibit Design)

Now that we’re down to our last 2 weeks, the pressure is on!  I’ve run through multiple hacks, various examples and a couple of my own very poor attempts to code.  The archaeologist in me says that I’ve met my match when it comes to coding for Processing or Arduino!

This image is the property of Interactive Matters.

However, after doing another exhaustive round of internet searches, I came across this really fun example of how Twitter can talk to Arduino. Created by Marcus Nowotny @ Interactive-Matter, the Twitballoon 2 is an excellent approach to having Arduino respond to a specific AND steady stream of Tweets! In its basic form, a stream of Tweets containing key words is analyzed and converted into increment values, which a stepper drive turns through, raising or lowering a balloon on a string. It's really an elegant solution.

A brief email chat with Marcus who was kind enough to respond to my questions indicated that this solution with some modifications might work for my project.
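If the Twitballoon approach does carry over, its core, as I understand it and written here purely as a hypothetical sketch of my own (the serial port, scaling factor and variable names are all assumptions), is just turning a running tweet tally into a target position that gets sent to the motor:

import processing.serial.*;

Serial my_port;      // serial connection to the Arduino (assumed; opened in setup() as usual)
int canTweets = 0;   // running tally of pro-Canadian tweets
int usaTweets = 0;   // running tally of pro-American tweets

void setup() {
  my_port = new Serial(this, Serial.list()[0], 9600);
}

// Map the difference between the two tallies to a flag height, scaled into a
// single byte (0-255) that the Arduino/stepper code would then turn into steps.
void updateFlag() {
  int diff = canTweets - usaTweets;                 // who is "winning"
  int height = constrain(128 + diff * 10, 0, 255);  // centre the flag, 10 units per tweet
  my_port.write(height);
}

The scaling factor and the 0-255 range are placeholders; the real mapping would depend on the gearing of the flag pulley.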

On a different front, to clear my head of Processing and Arduino code, I jumped back into the display design with my old and trusted colleague Romelle Espiritu. Romelle and I have been working together in the Digital Media, Film and Television industries for about 16 years now. I asked him to help clean up my initial design, which I'll also use as a template to build a display board.

We once did a pitch to Osprey Publishing Men-at-Arms to create a TV series, so I decided that stylistically we should follow the Osprey look to keep with the theme. When I first came up with the idea, I immediately thought of an Osprey-sponsored, Internet-based Flash version or even a full 12-14′ display at Niagara-on-the-Lake!

The next task is to send the image out to the printers to get a slightly larger version. Several copies will be made to act as templates for cutting out the support backing. Given the time constraints, I'm considering a wood or foam-core solution, but I would have liked to have had Bill's MakerBot replicate the pieces.

Given the difficulties I've had getting Twitter to run a flag up a pole, my fall-back position might be to use the natural querying process in Processing to drive a series of red and blue LEDs representing pro-Canadian or pro-American Tweets in our Twitter War prototype. That querying process, along with a shorter delay in the Arduino LED code, would give the effect of fireworks or explosions above the heads of my two soldiers, a little like the Twitter Mood hacks in which specific tweeted words trigger specific LED colours.
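A minimal sketch of that fallback (the keywords, characters and wiring are my own guesses, not working code), using the same assumed my_port serial connection as the sketch above, would just send a different character for each side and let the Arduino pulse the matching LED:

// Fallback sketch: send 'C' for a pro-Canadian keyword hit and 'A' for a
// pro-American one; the Arduino side would pulse a red or blue LED for a few
// tens of milliseconds so the flashes read like fireworks.
void flashForWord(String word) {
  if (word.equals("CAN1812")) {
    my_port.write('C');   // Arduino: pulse one colour briefly
  }
  else if (word.equals("USA1812")) {
    my_port.write('A');   // Arduino: pulse the other colour briefly
  }
}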

Not that I want to admit defeat, but a simple solution might be the best approach!

Twitter to Arduino Hack!

(This Post is a class requirement for History 9832b Interactive Exhibit Design)

I have to say, this project has been a tough slog! As discussed in previous blogs, there are a lot of Twitter-to-Arduino hacks out there, but each has its own very specific approach, which at times is hardware- and/or software-dependent. Further, I've learned that I'm a purely visual learner when it comes to physical objects or coding, which means I need to see someone do it first before I can really pick up on the process… helpful for learning how to chop wood ; )

Luckily I ran into a great chap who teaches at Ryerson by the name of Nick Stedman (http://www.nickstedman.com/). Oddly enough, he was teaching a class last week on Arduino to Twitter through Processing and invited me to sit in. Below is what I think is a very useful approach to having Twitter control Arduino, explained in a simple way.

So Nick had us work with a hack I had tried previously from Jer @ blprnt (http://blog.blprnt.com/blog/blprnt/updated-quick-tutorial-processing-twitter). This one needs user-defined API credentials from dev.twitter.com to generate "tokens" that allow the Processing code to access Twitter more securely. It also requires you to import the Twitter4J Core library, which you can get here (http://twitter4j.org/en/index.html). The part that Jer didn't supply was the Arduino hack to read the Twitter feed from Processing and pass it on to Arduino.

Processing Hack

So let’s start with Jer’s modified Processing code:

// Build an ArrayList to hold all of the words that we get from the imported tweets
// Needs SerialStandard for Arduino
// Note: the Twitter4J library must be available to the sketch; depending on your
// Processing version you may also need these imports explicitly:
import processing.serial.*;
import twitter4j.*;
import twitter4j.conf.*;
import java.util.Date;

ArrayList<String> words = new ArrayList<String>();

Serial my_port;    // Create object from Serial class
int rx_byte;       // Variable for data received from the serial port

void setup() {
  // Set the size of the stage, and the background to black.
  size(200, 200);
  background(0);
  smooth();

  String portName = Serial.list()[0];
  println(Serial.list());
  my_port = new Serial(this, portName, 9600);

  // Credentials - YOU NEED TO HAVE GENERATED TWITTER API TOKENS FIRST FOR THIS TO WORK -
  ConfigurationBuilder cb = new ConfigurationBuilder();
  cb.setOAuthConsumerKey("YOUR TWITTER API CONSUMER KEY");
  cb.setOAuthConsumerSecret("YOUR TWITTER API CONSUMER SECRET");
  cb.setOAuthAccessToken("YOUR TWITTER API ACCESS TOKEN");
  cb.setOAuthAccessTokenSecret("YOUR TWITTER TOKEN SECRET");

  // Make the twitter object and prepare the query - YOU NEED TO HAVE IMPORTED THE TWITTER 4J LIBRARIES FOR THIS TO WORK -
  Twitter twitter = new TwitterFactory(cb.build()).getInstance();

  Query query = new Query("Hi");
  query.setRpp(10);

  // Try making the query request.
  try {
    // Status status = twitter.updateStatus("Processing to Arduino Now"); // message needs to change per tweet

    QueryResult result = twitter.search(query);
    ArrayList tweets = (ArrayList) result.getTweets();

    for (int i = 0; i < tweets.size(); i++) {
      Tweet t = (Tweet) tweets.get(i);
      String user = t.getFromUser();
      String msg = t.getText();
      Date d = t.getCreatedAt();
      println("Tweet by " + user + " at " + d + ": " + msg);

      // Break the tweet into words
      String[] input = msg.split(" ");
      for (int j = 0; j < input.length; j++) {
        // Put each word into the words ArrayList
        words.add(input[j]);
      }
    }
  }
  catch (TwitterException te) {
    println("Couldn't connect: " + te);
  }
}

void draw() {
  // Draw a faint black rectangle over what is currently on the stage so it fades over time.
  fill(0, 25);
  rect(0, 0, width, height);

  // Draw a word from the list of words that we've built
  int k = (frameCount % words.size());
  String word = words.get(k);

  if (word.equals("Hi") == true) {
    my_port.write(255);
    delay(4);
    my_port.write(0);
  }

  if (k == words.size()-1) {
    println("new query");
    delay(1000);

    // Credentials - YOU NEED TO HAVE GENERATED TWITTER API TOKENS FIRST FOR THIS TO WORK -
    ConfigurationBuilder cb = new ConfigurationBuilder();
    cb.setOAuthConsumerKey("YOUR TWITTER API CONSUMER KEY");
    cb.setOAuthConsumerSecret("YOUR TWITTER API CONSUMER SECRET");
    cb.setOAuthAccessToken("YOUR TWITTER API ACCESS TOKEN");
    cb.setOAuthAccessTokenSecret("YOUR TWITTER TOKEN SECRET");

    // Make the twitter object and prepare the query
    Twitter twitter = new TwitterFactory(cb.build()).getInstance();

    Query query = new Query("Hi");
    query.setRpp(10);

    // Try making the query request.
    try {
      // Status status = twitter.updateStatus("Processing to Arduino Now"); // message needs to change per tweet

      QueryResult result = twitter.search(query);
      ArrayList tweets = (ArrayList) result.getTweets();

      for (int i = 0; i < tweets.size(); i++) {
        Tweet t = (Tweet) tweets.get(i);
        String user = t.getFromUser();
        String msg = t.getText();
        Date d = t.getCreatedAt();
        println("Tweet by " + user + " at " + d + ": " + msg);

        // Break the tweet into words
        String[] input = msg.split(" ");
        for (int j = 0; j < input.length; j++) {
          // Put each word into the words ArrayList
          words.add(input[j]);
        }
      }
    }
    catch (TwitterException te) {
      println("Couldn't connect: " + te);
    }
  }
}

With the Twitter4J libraries installed in your Processing sketch, you should be able to run this and get a constant stream of tweets printed in the sketch's console.

This code searches for the query word and sets the query to return 10 results:

Query query = new Query("Hi");
query.setRpp(10);

This code writes to the serial port that connects to the Arduino: if we find the query word (the equals() check returns true), it sends a value of 255, or fully "on", then delays and sends 0 to turn it off. Basically, if "Hi" is Tweeted, we send that to the Arduino as a positive value and turn the LED fully on; after a brief pause (delay() in Processing is in milliseconds, so 4 is only a few milliseconds), we turn the LED off and wait for the next value.

if (word.equals("Hi") == true) {
  my_port.write(255);
  delay(4);
  my_port.write(0);
}

This code, and everything repeated below it, asks the Processing sketch to run another query for "Hi" whenever the current word list has been exhausted:

if (k == words.size()-1) {
  println("new query");
  delay(1000);

The hack that Nick suggested is that, to initialize the query, we have to set it up first and return the initial results in setup(), and then the same code has to be added again inside draw() to ensure that the query runs continuously, looking for the query word "Hi".
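Since that means the credential setup and search code appear twice (once in setup() and again in draw()), one way to tidy it up, purely my own refactor rather than part of Nick's example, would be to pull the duplicated block into a helper that both places can call:

// Hypothetical refactor of the duplicated block above, meant to slot into the
// same sketch: just a sketch of how the repeated credential/query code could be shared.
void runQuery(String keyword) {
  ConfigurationBuilder cb = new ConfigurationBuilder();
  cb.setOAuthConsumerKey("YOUR TWITTER API CONSUMER KEY");
  cb.setOAuthConsumerSecret("YOUR TWITTER API CONSUMER SECRET");
  cb.setOAuthAccessToken("YOUR TWITTER API ACCESS TOKEN");
  cb.setOAuthAccessTokenSecret("YOUR TWITTER TOKEN SECRET");
  Twitter twitter = new TwitterFactory(cb.build()).getInstance();

  Query query = new Query(keyword);
  query.setRpp(10);

  try {
    QueryResult result = twitter.search(query);
    ArrayList tweets = (ArrayList) result.getTweets();
    for (int i = 0; i < tweets.size(); i++) {
      Tweet t = (Tweet) tweets.get(i);
      // Break the tweet into words and store them, exactly as above
      String[] input = t.getText().split(" ");
      for (int j = 0; j < input.length; j++) {
        words.add(input[j]);
      }
    }
  }
  catch (TwitterException te) {
    println("Couldn't connect: " + te);
  }
}

setup() would then just call runQuery("Hi") once, and draw() would call it again whenever the word list runs out.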

Arduino Sketch

Additionally, to get the Arduino LED to light, you need the Arduino sketch below, which was the missing piece in Jer's example above.

// Very basic program to try out serial communication.
// Checks for data on the serial port and dims an LED proportionally.
// Then reads a sensor, and transmits the value.
// NB. Serial is limited to one byte per packet, so constrain the data you communicate to 0-255.

int led_pin = 9;    // use "led_pin" to reference pin #
int rx_byte;        // a variable for receiving data
int sense;          // a variable for storing sensor data

void setup()
{
  Serial.begin( 9600 );          // start serial port at this speed (match with other software eg. MAX, Processing)
  pinMode( led_pin, OUTPUT );    // make pin an output - connect to LED (remember to use a >= 220 ohm resistor)
}

void loop()
{
  if( Serial.available() > 0 ) {        // if we receive a byte:
    rx_byte = Serial.read();            //   store it,
    analogWrite( led_pin, rx_byte );    //   and dim LED according to its value
  }
  sense = analogRead( 0 );              // read the sensor - returns 0 to 1023
  sense = map( sense, 0,1023, 0,255 );  // adjust values to transmit - scale to 0 to 255 (...dividing sense by 4 would do the same)
  Serial.write( sense );                // send the sensor data
  //  Serial.write( sense );            // use this command instead for new Arduino version
  delay(10);
}

This Arduino sketch is really just reading the data Processing sends over serial and turning the LED on and off based on how many times the Twitter keyword is found.

Conclusions

Here is yet another example of how to extract data from Twitter. Together with the previous post, we now have two methods of accessing Twitter through Processing. The first method is a straight query, using your Twitter log-in and password. The second, as described above, increases the security of your Twitter access by using the Twitter API functions to generate secure tokens.

Although Nick's method is a great first step, we still need to regulate how the Twitter query feeds into Processing and then Arduino. Right now it's a massive dump of info. With the additional code to repeat the query, we're still getting the same results plus new results in every query, so we need to ensure that each tweet only sends its value to the Arduino once. Then we can use that single value to inch our stepper motor and flag up the pole.
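One possible way to filter out those repeats, my own untested sketch rather than something from the class, is to remember the IDs of tweets we have already reacted to and only write to the serial port for new ones:

// Sketch only: remember which tweets have already been sent to the Arduino so
// that a repeated query doesn't trigger the same tweet twice.
// Assumes the old Twitter4J Tweet object used above exposes getId().
HashSet<Long> seenTweets = new HashSet<Long>();

void handleTweet(Tweet t) {
  long id = t.getId();                // unique ID of this tweet
  if (!seenTweets.contains(id)) {     // only react to tweets we haven't seen yet
    seenTweets.add(id);
    my_port.write(255);               // pulse the LED (or later, step the motor)
    delay(50);
    my_port.write(0);
  }
}

Each tweet would then count exactly once, which is what we need before hooking the same signal up to the stepper motor.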

It Works………Kind of!

 

Okay, it's been several weeks now. I've tried many Twitter-to-Arduino and Twitter-to-Processing sketches, but the best one I've found is this!

Twitter to Processing Sketch:

// http://blog.blprnt.com/blog/blprnt/quick-tutorial-twitter-processing
// (Requires the Twitter4J jar in the sketch's code folder; newer Processing
// versions may also need explicit twitter4j and java.util.Date imports.)

Twitter myTwitter;

void setup() {
  myTwitter = new Twitter("yourTwitterUserName", "yourTwitterPassword");
  try {
    Query query = new Query("sandwich");
    query.setRpp(100);
    QueryResult result = myTwitter.search(query);

    ArrayList tweets = (ArrayList) result.getTweets();

    for (int i = 0; i < tweets.size(); i++) {
      Tweet t = (Tweet) tweets.get(i);
      String user = t.getFromUser();
      String msg = t.getText();
      Date d = t.getCreatedAt();
      println("Tweet by " + user + " at " + d + ": " + msg);
    }
  }
  catch (TwitterException te) {
    println("Couldn't connect: " + te);
  }
}

void draw() {
}

This is a simple, elegant and very easy-to-set-up Processing sketch. I've tested it with "Twitter1812" and it works perfectly. Now, my assumption is that if this sketch can query or respond to specific word queries, we should be able to get it to seek two variables, such as CAN1812 or USA1812. Before I get there, however, I want to turn the output of this Processing sketch into a function that will then switch on an LED through a Processing command.
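As a rough sketch of that idea (the keyword names and the counting approach are my assumption, not tested code), the same search pattern from the sketch above could simply be run once per keyword and the result sizes compared:

// Rough sketch: run the same old-style Twitter4J search twice and compare how
// many tweets came back for each side of the "war".
int countTweets(Twitter myTwitter, String keyword) {
  try {
    Query query = new Query(keyword);
    query.setRpp(100);
    QueryResult result = myTwitter.search(query);
    return result.getTweets().size();
  }
  catch (TwitterException te) {
    println("Couldn't connect: " + te);
    return 0;
  }
}

// For example, inside setup():
// int canScore = countTweets(myTwitter, "CAN1812");
// int usaScore = countTweets(myTwitter, "USA1812");
// println("CAN: " + canScore + "  USA: " + usaScore);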

I found a great Processing-to-Arduino sketch, but unfortunately I lost the URL of the original site, so I apologize for not crediting the original creator. Again, this is a very simple pair of sketches: the first is the Processing sketch and the second is the Arduino sketch. Basically, Processing talks to the Arduino through a dedicated serial port. Bill's introduction to Firmata last week was meant to bypass the Arduino code bit, but like many of my classmates, I couldn't get the libraries working on my operating system.

Here is the Processing Sketch:

import processing.serial.*; // This allows us to use serial objects

Serial port;  // Create object from Serial class
int val;      // Data received from the serial port

void setup()
{
  size(200, 200);
  println(Serial.list());                  // This shows the various serial port options
  String portName = Serial.list()[1];      // The serial port should match the one the Arduino is hooked to
  port = new Serial(this, portName, 9600); // Establish the connection rate
}

void draw()
{
  background(255);
  if (mouseOverRect() == true)
  {                  // If mouse is over square,
    fill(150);       // change color and
    port.write('H'); // send an H to indicate mouse is over square
  }
  else
  {                  // If mouse is not over square,
    fill(0);         // change color and
    port.write('L'); // send an L otherwise
  }
  rect(50, 50, 100, 100); // Draw a square
}

boolean mouseOverRect()
{ // Test if mouse is over square
  return ((mouseX >= 50) && (mouseX <= 150) && (mouseY >= 50) && (mouseY <= 150));
}

And here is the Arduino Sketch:

const int ledPin = 13; // the pin that the LED is attached to - change this if you have a separate LED connected to another pin
int incomingByte;      // a variable to read incoming serial data into

void setup() {
  // initialize serial communication:
  Serial.begin(9600);
  // initialize the LED pin as an output:
  pinMode(ledPin, OUTPUT);
}

void loop() {
  // see if there's incoming serial data:
  if (Serial.available() > 0) {
    // read the oldest byte in the serial buffer:
    incomingByte = Serial.read();
    // if it's a capital H (ASCII 72), turn on the LED:
    if (incomingByte == 'H') {
      digitalWrite(ledPin, HIGH);
    }
    // if it's an L (ASCII 76) turn off the LED:
    if (incomingByte == 'L') {
      digitalWrite(ledPin, LOW);
    }
  }
}

Processing creates an interactive box; when the mouse rolls over it, the Arduino LED turns on. So my basic assumption is this: if Processing can receive an input from Twitter, it can write that input out through a function that turns an LED on and off on the Arduino. If we can accomplish that, swapping the LED for a Motor Shield to power our flag gear and pulleys should be easy!
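To make that assumption concrete, here is an untested sketch of the bridge: instead of mouseOverRect(), the decision to send 'H' or 'L' would come from whether the Twitter query found anything. The helper names are mine, not from either original tutorial.

// Untested sketch combining the two ideas above: if the Twitter search
// returned at least one tweet, send 'H' (LED on), otherwise send 'L' (LED off).
void signalArduino(int tweetCount) {
  if (tweetCount > 0) {
    port.write('H');   // the Arduino sketch above turns the LED on
  }
  else {
    port.write('L');   // ...and turns it off
  }
}

// In draw(), after running a query (see the counting sketch earlier):
// signalArduino(countTweets(myTwitter, "Twitter1812"));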

If anybody has any suggestions I’m all ears!

 

Twitter to Arduino……..Everything but the Flashing LED!

(This Post is a class requirement for History 9832b Interactive Exhibit Design)

How simple can it be? Get Twitter to send a command to my Arduino and turn an LED on. For the life of me, I can find every conceivable Twitter-to-Arduino combination, from Twitter-controlled toilets to Twitter notifications for Kitty. But a simple Arduino or Processing sketch that can turn an LED on from Twitter? That I can't find!

In order for the Arduino to receive commands from Twitter, it has to be connected to a computer with an internet connection, or it has to have an Ethernet Shield or a WiFi Shield. I'm going to opt to run it through the computer for simplicity's sake. Doing so requires me to run a Processing script to receive the Twitter data and then pass those commands to the Arduino to turn on a light or perform some other function.

Although there was plenty of chatter about limits on how many times Twitter can be accessed in an hour, apparently this limit has been lifted, so no additional code is required within Processing to work around it. Additionally, Processing needs to be authorized to access or receive data from Twitter, so another step is to set up an API account on Twitter.

After three days of this, I'm thinking that instead of Twitter, we'll just do this the old-fashioned way and have two contestants pound a button as fast as they can, so that the Arduino can receive the commands physically to raise the flag!

My plan of action is this now:

  1. Get Processing to turn an LED off and on.
  2. Get Twitter to go through Processing to turn an LED off and on.
  3. Get Arduino to turn a gear (which will later hoist a flag) through a Twitter command.

I’ll be lucky at this point to accomplish this!  And no, I don’t want to see another Sketch to turn my toilet seat into a Twitter notification!

Vectors & 3D Animation!

Last week in Bill's 9832b class, we spent some time understanding how to visualize within a vector environment. I can't believe that I've been working with Adobe Illustrator, a 2D vector graphic design application, for almost 18 years! Even with all of those years, I learn something new every time. As you can see in my post Flag Widget! – Part I, I used Illustrator to trace an object into a vector file format that could be read by a 3D printer or other physical printing device. By no means am I an expert, and I readily rely on Romelle Espiritu, friend and colleague, to do any of the heavy Illustrator "lifting" on major projects. However, Adobe Photoshop, Illustrator and Acrobat are the three most widely used tools in my graphic arsenal.

This week's post is really a bit of a history lesson on the rich and deep history of 3D animation in Canada. But first, I want to say something about nomenclature. Since the release of Avatar a few years ago, the term "3D" has come to mean stereoscopic movies to the general public, or in general terms, watching a movie with those funny glasses. In the business we call this "S3D". When I talk about 3D animation, I'm using the traditional industry term for computer-generated imagery (CGI) or 3D animation.

Most Canadians, and generally most industry professionals outside of Canada, don't realize that it was two Canadian NRC physicists who invented the basis for computer animation, namely computerized key frame animation. In the late 60s, Marceli Wein and Nestor Burtnyk, while on loan to the NFB, researched and created the first vector-based computer-animated film, La Faim.

To create the technology needed to produce the film, Wein and Burtnyk devised several tools that were very interesting for the time: real-time rendering to film, a wooden mouse to interface with the vector artwork and, of course, the ability to create computerized keyframes, mirroring the keyframing done in traditional animation at studios like Walt Disney and Warner Bros.

In 1997, the Academy of Motion Picture Arts and Sciences awarded Wein and Burtnyk a Technical Oscar, recognizing their contributions to the film industry. Below is an NFB documentary on their technical process for computer animation.

After La Faim, an explosion of industry firsts propelled Canada front and centre on the world stage as the preeminent country for producing animation of any kind. Sheridan College became the first school in the world to offer animation as a college program. Nelvana was founded and became the largest non-film animation studio outside of California. Most importantly, three Canadian technology companies, Softimage (first use of inverse kinematics for character animation), Alias (first NURBS modelling software) and Side Effects Software (inventor of procedural animation), exploded onto the scene with computer animation tools that allowed traditional artists to actually use the technology in a truly artistic way, although still somewhat dependent on an army of technical gurus.

I graduated from Sheridan College in Computer Animation in 1996. I immediately went to work for Side Effects Software as a demo artist and was proudly employee number "34". Over the following three years, Side Effects sent me around the world several times to meet studios, demonstrate the software to professionals far more accomplished than I was, and learn the nature of 3D animation production, pipelines and filmmaking. Along the way I worked with Kim Davidson, Greg Hermanovic, Henry LaBounta, Sean Lewkiw, Katsuhiro Otomo, Ken Perlin, Nick Park, David Sproxton and a host of memorable mentors, friends and associates. I've been interviewed on Japanese television debating the fine points of a CGI Godzilla versus a "guy in a rubber suit" and given live demonstrations to thousands of people at a time. In May of 2000, I personally animated/modelled/rendered my last file. Since that time, my role in the industry has been developing and implementing animation pipelines, building studios, and executive producing series, films and various projects. So, although I'm the "3D Guy", I haven't actually animated, modelled, lit or rendered anything since.

It should be interesting to see what I can come up with for this week's class assignment to work in Google SketchUp, Blender or some other new-to-me 3D software : )

Johnny Thunder!

Johnny Thunder was an interesting development project at theskonkworks/Calibre Digital Pictures. It was originally conceived as a pitch to get Bob Thompson from Lego Media (co-creator of the Bionicle brand and Executive Producer of the first three Bionicle movies) interested in further expanding Lego's film & TV universe. I had recently returned from Aardman Animations in the UK to become Director of Development, and then Executive Producer, at Calibre Digital Pictures when it was decided that we'd try and track down Bob to get an "audience" at the upcoming Kidscreen in New York (where all of the TV buyers go to sell and buy kids' entertainment). After a couple of weeks of pulling strings and asking for favours, I got Bob's elusive email address and telephone number and gave him a ring in the UK to pitch the concept of "Johnny Thunder" as a TV series. I remember it being two weeks before Christmas.

Bob was receptive in that very reserved British way, and he agreed to meet in February at Kidscreen to see a "test", which of course we hadn't started making yet. After the phone call, I sat in my office thinking, "holy shit", we're going to have to make a short in less than six weeks! Mobilizing our amazing team, who were also producing Ace Lightning season 2, Shoebox Zoo and Henry's World, we put together a group of about 30 people who volunteered to work extra time to design, develop and finish the test.

With days to spare before the big meeting, everything was in place: a custom "Johnny Thunder" leather pouch with pitch materials, DVDs of the short and my flight tickets to New York for the big presentation. Here's the original pitch.