Technical

Learning New Software: After Effects and Premiere Pro


Using YouTube tutorials and forums, I taught myself Adobe After Effects. Some of the skills I have developed are layering video and still images, layer masks, inserting and animating shapes and text, enhancing video footage, keyframing… the list is pretty endless. I would edit together short sections of footage, export them from AE and import them into Adobe Premiere Pro in order to piece together longer reels of video. This process was very, very time consuming! I had allocated a lot of time for it, yet it still seemed to take forever! After losing many hours of work right at the beginning, I quickly learnt the importance of setting up a project correctly and making sure the auto-save functionality is well and truly switched on. So mix lost work with a longer process than first assumed and a gazillion other things to worry about… this was beginning to take its toll and the stress levels were rising.


 

Although After Effects has audio functionality, I found the audio much easier to work with once the clip was in Premiere Pro. I think this is an area I would like to improve on, because the software is so clever, but since I was teaching myself I never quite managed to learn everything. I tried to bring each video to a professional, polished final quality because these videos were the heart and soul of the project; they needed to be good. Incorporating what I have researched about sound recognition and the emotive responses certain sounds provoke, the audio also needed to be included carefully and purposefully.

Interactivity Code Elements: Problems

Cross-Browser Compatibility.

This was my problem. It is so easy to get stuck in your own world, on your own laptop, and forget that not everyone uses the same things you do.

To overcome the problems we faced, namely both installations failing to work in Safari, we had to change our plan of action.

For the Twitter speaking page, we used a plugin called myspeak.js. We chose this plugin because it runs on JavaScript alone, which meant it did not require detecting the user agent and would be compatible across browsers.
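For anyone curious, the call involved boils down to something like the sketch below. I have written it against a meSpeak.js-style API (loadConfig/loadVoice/speak); the exact function names in the plugin we used may differ, so treat these as assumptions:

```javascript
// Sketch of speaking a tweet with a JavaScript-only TTS library.
// Assumes a meSpeak.js-style API (loadConfig/loadVoice/speak); the
// plugin we actually used may name these differently.
meSpeak.loadConfig("mespeak_config.json"); // engine configuration
meSpeak.loadVoice("voices/en/en.json");    // English voice data

function speakTweet(tweetText) {
  // Strip URLs so the synthesiser doesn't spell them out letter by letter
  var cleaned = tweetText.replace(/https?:\/\/\S+/g, "");
  meSpeak.speak(cleaned, { speed: 160 }); // pure JS, no user-agent sniffing
}

speakTweet("I HAD PORRIDGE THIS MORNING FOR BREAKFAST #readyfortheday");
```

Because everything happens in JavaScript, the same code path runs in every browser, which is exactly why we picked this route.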

For the webcam plugin we initially used a JavaScript-only plugin; however, this did not allow the webcam function to load in Safari. To overcome this we instead used a Flash plugin which, once set up, was much simpler to use and didn't alter the overall look or feel of the piece. The "allow" or "deny" pop-up was not something I had anticipated, and it did ruin the element of surprise I wanted to add to the piece. However, sometimes these things just cannot be helped. For the viewer's peace of mind and privacy it is very relevant, and additionally a disclaimer must go somewhere to reassure them that the footage is not being recorded or stored anywhere!
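For reference, the JavaScript-only route we first tried looks roughly like the sketch below, using the standard getUserMedia API in its modern promise-based form (at the time it was vendor-prefixed and missing from Safari entirely, which is exactly the wall we hit). The element ID is a placeholder:

```javascript
// Sketch of the JavaScript-only webcam route via getUserMedia.
// This is the call that triggers the browser's "allow"/"deny" prompt;
// Safari's lack of support at the time is why we fell back to Flash.
navigator.mediaDevices.getUserMedia({ video: true, audio: false })
  .then(function (stream) {
    // User clicked "allow": attach the live feed to a <video> element
    var video = document.getElementById("webcam-box"); // placeholder ID
    video.srcObject = stream;
    video.play();
  })
  .catch(function (err) {
    // User clicked "deny" (or has no camera): degrade gracefully
    console.log("Webcam unavailable: " + err.name);
  });
```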

Interactivity Code Elements

So my portfolio of artistic, visually engaging pieces is starting to take shape; however, I want to take it to the next step. I want to put the user in the piece, and I have chosen to do this in two different ways.

  • Using the webcam

Analysing people's pictures was a hot topic of discussion throughout the research process, with people openly admitting to judging other people's pictures and then having to almost remind themselves that others would be doing the same about them. By using a webcam plugin, I aim to pull in the live image of whoever is engaging with the piece and make them believe their face is in a live social media image environment. I will mock up a page with a live feed of comments showing what people are actually usually thinking when they see an image uploaded online (a rough sketch of this feed follows below). This will demonstrate not only what people are really thinking when they see pictures, or upload pictures themselves, but will also make the viewer feel uncomfortable, poking fun at the fact that they would not truly enjoy life online quite so much if they felt as exposed in real life as they do through images and status posts.

The design aspect of incorporating a live webcam image is relatively easy to mock up, with the webcam image box embedded in a page and the surrounding page designed to look like a Facebook photo page.
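The comment-feed half of the mock-up could be as simple as the sketch below; the element ID and the comments themselves are illustrative placeholders, not the final content:

```javascript
// Illustrative sketch: cycle pre-written "what people really think"
// comments into the mock photo page, alongside the live webcam box.
// The element ID and the comment text are placeholders.
var comments = [
  "Why have they uploaded ANOTHER selfie?",
  "Nice filter... you can still see the mess behind you.",
  "Like. Scroll. Forget."
];
var i = 0;

setInterval(function () {
  var feed = document.getElementById("comment-feed"); // placeholder ID
  var item = document.createElement("p");
  item.textContent = comments[i % comments.length];
  feed.appendChild(item); // a fresh "thought" appears under the photo
  i++;
}, 3000);                 // every three seconds
```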

  • Using live Twitter stream

This project is all about the blurring lines between social media online and off… so, since Twitter has often been criticised as people just "shouting" about stuff no one really cares about, I want to turn this on its head. I want tweets to be read aloud as if they were being spoken. This will be representative of how people wouldn't behave on Twitter the way they would in real life, and will again poke fun at the fact that we engage with social media at all. Since research has suggested that "hashtagging" is the 'done thing', and that people do it either because other people do or to be funny, this interactive section will work by people searching for a word, after which a tweet with that word as a hashtag will be read aloud. The functionality of this will be quite complex, especially if I want it to be live. It will require registering as a Twitter app developer and getting an API key; a sketch of the browser side follows below.
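Because the Twitter API key has to stay server-side, the browser end would look something like this. The `search_tweets.php` proxy is hypothetical (it doesn't exist yet), and `speakTweet` is the helper from the earlier text-to-speech sketch:

```javascript
// Hedged sketch of the browser side of the hashtag search.
// "search_tweets.php" is a hypothetical server-side proxy that signs
// the request with our Twitter API key and returns matching tweets.
function fetchHashtag(word) {
  $.getJSON("search_tweets.php", { q: "#" + word }, function (tweets) {
    if (tweets.length > 0) {
      speakTweet(tweets[0].text); // read the most recent match aloud
    }
  });
}

fetchHashtag("readyfortheday");
```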

Having openly admitted that code is not my strong point, I have enlisted help from Dominic White, a BA (Hons) Computing final-year student at Bournemouth University. By working with somebody outside of the course and building cross-course collaborations, I feel I have improved my communication skills and furthered my computer knowledge more generally. Dom has acted similarly to a tutor, which I feel has benefited my personal development more than a straight-swap collaboration would have done, because I have seen the code grow and take shape from the very beginning. By working through the coding aspects of both these interactive elements, I now have a better understanding of the Twitter API, jQuery, JavaScript and how plugins work.

 

MAJOR PERSONAL ACHIEVEMENT *round of applause please*

In a time as stressful as this, it's important to celebrate the small things in life.

After working really hard to develop my personal skills in HTML, CSS, PHP and JavaScript, I have managed to code a mobile-responsive website template, complete with drop-down navigation bars and everything. Okay, okay, I am aware of how simple this sounds, but for me it's a huge achievement, and it means I can get underway with coding the website portfolio to host my mini series of "The Hashtag Life".
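For the record, the drop-down part boils down to a little jQuery toggle like the sketch below; the class names come from my template, so treat them as placeholders:

```javascript
// Sketch of the mobile drop-down navigation toggle using jQuery.
// ".nav-toggle" and ".nav-menu" are placeholder class names.
$(document).ready(function () {
  $(".nav-toggle").on("click", function () {
    // slideToggle shows/hides the menu; CSS media queries decide
    // whether the toggle button is visible at a given screen width.
    $(".nav-menu").slideToggle(200);
  });
});
```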

Last week I was in talks with fellow course member Leeroy, as well as Chris, a postgrad student from Plymouth Uni, who have set me on the way to coding my "Twitter in real life" installation, where tweets are read aloud after the user searches for whatever term they like.

 

Reiteration of the project idea:

For my graduate project, The Hashtag Life, I am creating an interactive online portfolio of digital art. The project's aim is to make the users/audience question their own usage of social media and the impact it is having on real-life interaction. For a small section of the portfolio I am looking for somebody to help me code a "Twitter in real life" experience. I basically want users to be able to search for a hashtag word and for the most recent tweets to be played back to them as spoken sound, as if the tweets were being spoken aloud. This is to show how bizarre tweets would sound if they were actually spoken in day-to-day conversation.

I think I need to use the Twitter API to store tweets in a MySQL database, then build a search so users can look up words. The search results would be captured in an array in JavaScript or jQuery, and the array of tweets passed through a text-to-speech API (all of it using Ajax to make it asynchronous)… I hope this makes sense, and there may be a better way to do it? A rough sketch of what I mean follows.
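To make the question concrete, here is roughly the pipeline I have in mind. Everything named here is an assumption about pieces that don't exist yet: `search.php` would query the MySQL tweet store, and `speak()` stands in for whatever text-to-speech call we end up with:

```javascript
// Rough sketch of the proposed pipeline: Ajax search -> array of
// tweets -> text-to-speech, one tweet after another.
// "search.php" (backed by the MySQL tweet store) and speak() are
// hypothetical; this shows the shape, not a working build.
function searchAndSpeak(term) {
  $.ajax({
    url: "search.php",          // queries the stored tweets in MySQL
    data: { q: term },
    dataType: "json",
    success: function (tweets) {          // tweets arrives as an array
      var i = 0;
      (function next() {
        if (i >= tweets.length) return;   // finished the whole array
        var tweet = tweets[i++];          // advance before speaking
        speak(tweet.text, next);          // TTS, then next() when done
      })();
    }
  });
}
```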

Additionally, a webcam plugin where the viewer is put in the experience… perhaps in a profile picture scenario…

 

 

Production Analysis: Technical


The technical aspect of my project requires me to create a totally immersive experience for the user. I still want to tie in the very first idea that I pitched back in November, and this means utilising social media channels to distribute my project…

This is why I am creating a series of videos, all linked together in a network where different path options are determined by the user. All of the videos will be hosted on the YouTube channel I have set up specifically for this piece of work. Each video will have links at the end, so the user can decide which path to take through the experience and which video to watch next by clicking a link.
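The videos themselves will just use clickable YouTube links, but conceptually the network is a simple graph; the names below are placeholders for illustration, not the real videos:

```javascript
// Conceptual sketch of the branching video network: each video maps
// to the choices offered at its end. Names are placeholders.
var videoNetwork = {
  "intro":          ["morning-scroll", "status-update"],
  "morning-scroll": ["selfie-check", "outro"],
  "status-update":  ["selfie-check", "outro"],
  "selfie-check":   ["outro"]
};

// The user's path through the piece is just a walk through this
// graph, one clicked link at a time.
```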

The videos will also be uploaded to The Hashtag Life portfolio website, http://www.untagme.co.uk, where there will be some background information and some fun codey stuff (yes, I am attempting tricky codey stuff! I've even learnt what API means and everything, plus I am registered as a Twitter developer, so that's got to mean something! 🙂 …). This will involve reading tweets out loud so that users can engage and see just how bizarre Twitter in real life would actually be.

This in turn will not only improve my HTML, CSS, PHP and JavaScript knowledge and understanding immensely, but will also allow me to focus on my main interest for the future, which is working with video design and photography. This project will allow me to polish my expertise in Adobe After Effects, Premiere Pro, Photoshop and Illustrator, all of which will be useful for career choices after uni.

Online meets Offline: What if we could hear tweets/posts?

There is an ongoing debate that social media is becoming more and more anti-social, with people glued to their smartphones and contactable 24/7. People are so caught up in having a relationship with their social media outlets that they are letting the real world pass them by.

These clips, taken from Bruce Almighty (2003), directed by Tom Shadyac and distributed by Universal Pictures, feature Bruce, played by Jim Carrey, struggling to cope with the number of prayers he can hear. In the film, Bruce has been given the powers of God by the man himself, played by Morgan Freeman, which is why he can now hear people's prayers within a 50-mile radius. Prayers are usually thoughts or private mutterings, meant for God alone. This got me thinking: what if we could hear tweets, or posts, or messages, or emails? This is reflective of my tutor's analogy of Twitter: that it's just a bunch of people acting like seagulls standing on a rock and shouting, and that the words don't mean anything; they just end up getting blurred together for mass viewing and are not for communicative purposes.

(Clip taken from Finding Nemo (2003), directed by Andrew Stanton, distributed by Walt Disney Pictures)

…back to Bruce Almighty.

This clip demonstrates how socially uncomfortable hearing the prayers makes Bruce. He cannot cope with the situation he is in and needs to take a break… this could be something we have all experienced, where we have become too indulgent in our social media behaviour, particularly amongst procrastinating students who will often say "I am so bored of distracting myself on Facebook".

This clip was perfect for demonstrating just how ridiculous the world would be if tweets, posts, or whatever anyone shares online were actually physical objects in a physical space in an offline world. The sheer quantity is ridiculous. What I like about this, however, is how it makes you question the importance of what people post… would people post online as much as they do if what they were posting was an actual object? For example, if Bournemouth University actually had a physical wall to represent a Facebook wall, how many people would actually write something on it for the rest of the student body to see? Again, this highlights how silly sharing online would actually seem if it happened in the real world. For example, you would not walk down the street in the morning and shout to the nearest stranger "I HAD PORRIDGE THIS MORNING FOR BREAKFAST AND IT WAS FANTASTIC #readyfortheday".

Going back to the idea of a joiner image: building a physical picture of an environment represented by how it is talked about online, instead of just the physical object that it is, is an exciting idea to be moving forward with.

Offline Environment Online


Seeing something from all angles. There is a whole hidden story behind the world we see in front of us.

Inspired by the works of David Hockney, an interactive joiner image of an offline and online world could be interesting to explore. I think the strong impression given by still joiner images really gets across how looking at the same object from lots of different angles gives a totally different impression of that object. This offers further understanding of what I mean when I discuss how people respond and react differently to what other individuals post online, i.e. preconceived ideas about traditional online protocols. For example, the selfie: people who take selfies are commonly described as vain or attention-seeking, but in reality they are often not like that at all.

Some joiner image examples:

(Images: David Hockney's joiner of Place Furstenberg, Paris, and other photo-joiner examples, including a bedroom joiner by wardy360.)

Augmented Reality App

Introducing Zappar, an app that could revolutionise static print material… This app scans a still print image and reveals a video, effectively bringing the image to life. This could be interesting with regards to my story… an image that, when scanned, would reveal part of the story perhaps? An issue with this would be people not having the app and not wishing to download it in order to participate. This would result in lower levels of audience involvement; however, it could be a good way to market the piece and encourage people to get involved initially. To use Zappar, I would have to make use of the free 30-day trial in order to create my own Zaps. This means I would have to time everything correctly so that creating and broadcasting zaps fits within that window.