Amazon Offers Animators A New Way to Make Money from Online Video

In a major development for independent content creators, Amazon is inviting all creators to post videos to its Amazon Video service and earn royalties based on hours streamed. The service went live today.

Amazon Video Direct (AVD) is a self-service platform that puts the e-commerce giant in direct competition with video sites like YouTube and Vimeo. AVD makes video content available for free to all Amazon Prime members, who number in the “tens of millions,” and pays creators based on how long their content is streamed.

In addition to streaming, AVD provides other options to creators: content can be made available to rent or own, viewable for free with advertisements, or packaged together and offered as an add-on subscription (through the Streaming Partners program).

“It’s an amazing time to be a content creator,” said Jim Freeman, vice president of Amazon Video, in a press release. “There are more options for distribution than ever before and with Amazon Video Direct, for the first time, there’s a self-service option for video providers to get their content into a premium streaming subscription service. We’re excited to make it even easier for content creators to find an audience, and for that audience to find great content.”

Launching alongside this new system is the AVD Stars Program, which will distribute $1 million every month among the 100 most popular AVD titles in Prime Video. The monthly bonus is in addition to any other revenue earned through the program, and everyone who uses AVD is automatically enrolled to participate.

AVD launch partners include a mix of both established media brands and online media producers, among them Conde Nast Entertainment, HowStuffWorks, Samuel Goldwyn Films, The Guardian, Mashable, Mattel, StyleHaul, Kin Community, Jash, Business Insider, Machinima, TYT Network, Baby Einstein, CJ Entertainment America, Xive TV, Synergetic Distribution, Kino Nation, Journeyman Pictures, and Pro Guitar Lessons.

Creators who want more information on signing up for Amazon Video Direct can visit Amazon’s website.

Watch Lytro Change Cinematography Forever

The Lytro Cinema Camera could be the most groundbreaking development in cinematography since color motion picture film.

Since Lytro first teased its Cinema Camera earlier this month, articles have been written and press conferences held. Lytro’s presentation at NAB 2016 was standing room only, and hundreds were turned away. Several press outlets did write-ups of the demo; we’ve been writing about the technology concept for five years. But words don’t do it justice: you have to see the new Lytro cinema system in action, including its applications in post-production, to understand just how much this technology changes cinematography forever.

On its own, it would be a supreme technical accomplishment to develop a 755 megapixel sensor that shoots at 300 frames per second with 16 stops of dynamic range (for reference, the latest ARRI and RED cinema cameras top out at 19 and 35 megapixels, respectively). But those outlandish traditional specifications might be the least interesting thing about the Lytro Cinema Camera. And that’s saying something, when developing the highest resolution video sensor in history isn’t the headline.

The headline, as Jon Karafin, Head of Light Field Video at Lytro, explains, is that Lytro captures “a digital holographic representation of what’s in front of the camera.” Cinematography traditionally “bakes in” decisions like shutter speed, frame rate, lens choice, and focal point. The image is “flattened.” By capturing depth information, Lytro is essentially turning a live action scene into a pseudo-CGI scene, giving the cinematographer and director control over all of those elements after the fact.
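
To make the refocus-after-the-fact idea concrete, here is a toy Python sketch of the classic “shift-and-sum” synthetic refocus from the light field literature. To be clear, Lytro has not published its production pipeline; the 3x3 sub-aperture grid, the alpha parameter, and the use of np.roll in place of proper sub-pixel interpolation are all simplifying assumptions made for illustration.

```python
# Toy "shift-and-sum" synthetic refocus over sub-aperture views.
# Conceptual sketch only; not Lytro's proprietary pipeline.
import numpy as np

def refocus(subapertures, alpha):
    """Average sub-aperture views, shifting each in proportion to its
    (u, v) lens offset; alpha selects the virtual focal plane."""
    acc = None
    for (u, v), img in subapertures.items():
        # np.roll stands in for proper sub-pixel interpolation.
        shifted = np.roll(img,
                          (int(round(alpha * v)), int(round(alpha * u))),
                          axis=(0, 1))
        acc = shifted if acc is None else acc + shifted
    return acc / len(subapertures)

# Toy usage: random data standing in for a 3x3 grid of sub-aperture views.
# alpha = 0 keeps the as-shot focus; other values refocus after the fact.
views = {(u, v): np.random.rand(64, 64)
         for u in (-1, 0, 1) for v in (-1, 0, 1)}
refocused = refocus(views, alpha=2.0)
```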

The technique, which is known as light field photography, is not simply enabling control ex post facto over shutter speed or frame rate: the implications for visual effects are huge. You can “slice” a live scene by its different “layers.” Every shot is now a green screen shot. But it’s not an effect, per se; as Karafin notes, “it’s not a simulation. It’s not a depth effect. It’s actually re-ray-tracing every ray of light into space.”

To fully understand the implications of light field photography requires a lengthy video demo… so we released one. Karafin gave us a live demonstration of “things that are not possible with traditional optics” as well as new paradigms like “volumetric flow vectors.” You can tell the demo is live because the CPU starts heating up and you can hear the fans ramp up in our video… and the computer was on the other side of the room.

 

Visual effects implications

If light field photography fails to revolutionize cinematography, it will almost certainly revolutionize visual effects. Today, visual effects supervisors tag tracking markers onto various parts of a scene—think of the marks you often see on a green screen cyc—so they can interpret a camera’s movement in post. The Lytro camera, because it understands depth, knows not only where it is in relation to the elements of a scene, but where the subjects are in relation to the background (and where everything in between sits). This depth data is made available to visual effects artists, which makes their integration of CGI elements much more organic, because everything in the scene now has coordinates on the Z-axis. They’re not matting out a live-action person to mask out a CGI element; with Lytro they are actually placing the CGI element behind the person in the scene.

Green screening takes on a new meaning, too; as you can see in the demo, it’s no longer chroma- or luminance-based keying but true “depth-screening.” You can cut out a layer of video (and dive into more complex estimations for things like hair, netting, etc.). With Lytro, you don’t need a particular color backdrop to separate the subject; you can simply separate the subject based on their distance from other objects.
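
As a rough illustration of what depth-screening replaces chroma keying with, here is a minimal numpy sketch: the matte comes from per-pixel distance, and depth ordering decides whether a CGI element sits in front of or behind live action. This is a conceptual toy under assumed inputs (a frame plus an aligned depth map in metres), not Lytro’s ray-traced compositing.

```python
# Toy "depth-screening": matte by distance, composite by depth order.
import numpy as np

def depth_key(frame, depth, near, far):
    """Matte out whatever lies between near and far (in metres)."""
    matte = (depth >= near) & (depth <= far)    # boolean mask by distance
    subject = frame * matte[..., None]          # keep only in-range pixels
    return subject, matte

def composite_behind(frame, depth, cg_layer, cg_depth):
    """Place a CG layer behind live-action pixels using depth ordering."""
    live_wins = (depth < cg_depth)[..., None]   # live pixel is nearer
    return np.where(live_wins, frame, cg_layer)

# Toy usage with random data standing in for a frame and its depth map.
frame = np.random.rand(480, 640, 3)
depth = np.random.uniform(0.5, 10.0, (480, 640))         # metres
cg = np.ones((480, 640, 3)) * np.array([0.2, 0.8, 0.2])  # flat green layer
out = composite_behind(frame, depth, cg, cg_depth=4.0)
```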

Every film is now (or could be) stereoscopic 3D

Stereoscopic 3D is another matter entirely. Shooting in 3D currently involves strapping two cameras together in an elaborate rig, with a stereographer setting and adjusting the interaxial separation for every shot (or doing a 3D “conversion” in post, with artists cutting out different layers and creating parallax effects manually, which yields inferior results). The Lytro camera, because it has millions of micro lenses scanning a scene from slightly different perspectives, can do 3D in post without it being a simulation. You don’t need to shoot on two cameras for a 3D version and then just use the left or right camera for the 2D version. With Lytro you can set the parallax of every shot individually, choose the exact “angle” within the “frustum” you want for the 2D version, and even output an HFR version for 3D and a 24P version for 2D—with the motion blur and shutter speed of each being “optically perfect,” as Karafin notes. Even if you shoot your film on a Lytro with only 2D in mind, if advances in 3D display technology later change your mind (glasses-free 3D, anyone?), you could “remaster” a 3D version that doesn’t have any of the artifacts of a typical 3D conversion. With Lytro you’re gathering the maximum amount of data independent of the release format and future-proofing your film for further advances in display technology (more on this in a bit).
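
The relationship between depth and parallax that a stereographer tunes can be written down directly: disparity in pixels is focal length times interaxial baseline divided by depth. The naive sketch below forward-warps a frame by that per-pixel disparity, which is essentially what a 2D-to-3D conversion does; Lytro’s advantage is that it synthesizes real views from captured rays, so the disocclusion holes this toy leaves behind never arise. The frame, depth map, baseline, and focal length are all assumed inputs.

```python
# Naive depth-based view synthesis (the geometry behind parallax).
# This is the 2D-to-3D-conversion approach, NOT Lytro's ray-based one.
import numpy as np

def synthesize_view(frame, depth, baseline, focal_px):
    """Forward-warp a frame horizontally by per-pixel disparity."""
    h, w = depth.shape
    out = np.zeros_like(frame)                    # disocclusion holes stay black
    disparity = (focal_px * baseline / depth).astype(int)
    for y in range(h):
        for x in range(w):
            nx = x + disparity[y, x]              # nearer pixels shift further
            if 0 <= nx < w:
                out[y, nx] = frame[y, x]
    return out

frame = np.random.rand(120, 160, 3)
depth = np.random.uniform(1.0, 20.0, (120, 160))  # metres
left = frame                                      # treat the source as the left eye
right = synthesize_view(frame, depth, baseline=0.065, focal_px=800)
```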

What light field photography doesn’t change

The art of cinematography—or, as many of its best practitioners deem it, a “craft” as opposed to an art—is not limited to camera and lens choices. Cinematography is often referred to as “painting with light,” and the lighting of a scene, with or without a Lytro camera, is still the primary task. While Lytro captures depth information, my understanding is that the quality and angle of light are interpreted and captured but not wholly changeable in post (it’s also possible that, if they are, it simply went over my head). As Karafin notes, Lytro’s goal with the cinema camera (as opposed to its Immerge VR technology, which allows a wholly moveable perspective for virtual reality applications) is to preserve the creative intent of the filmmakers. This means your lighting choices and your placement of the camera are, for the most part, preserved (the current version of the cinema camera can be “moved” in post by only 100mm, or about 4 inches). As a director you are still responsible for the blocking and performances. As a cinematographer you are still responsible for all of the choices involved in lighting and camera movement.

This is part of a continuum. As cinematography has transitioned to digital capture, it has in many ways become an acquisition process, with more and more decisions deferred to post. The Digital Intermediate gives the colorist greater control than ever before. Cinematographer Elliot Davis (Out of Sight, The Birth of a Nation) recently told NFS that if he could only have one tool besides the camera, it would be the D.I., not lights or any on-set device. Lytro maximizes the control filmmakers have in post-production, but it is not actually “liberating your shots from on-set decisions” (our fault for not fully understanding the technology, pre-demo).


The short test film Life

I can already hear people reacting to some of the shots in the demo with “but this effect looks cheesy” or “that didn’t look realistic.” The same can be said for any technique in inexperienced hands; think of all the garish HDR photographs you’ve seen out there. With those photographs, HDR itself isn’t the issue; it’s the person wielding the tool. And in the case of Lytro, there is no such thing yet as an experienced user. The short film Life will be the first production to use it. Even in the hands of experienced Academy Award winners like Robert Stromberg, DGA, and David Stump, ASC (along with The Virtual Reality Company), the Lytro Cinema Camera is still a brand new tool whose techniques and capabilities are unknown.

The success or failure of Lytro as a cinema acquisition device has little to do with how Life turns out, given the technological implications extend far beyond one short film.

Cameras aren’t done

On our wrap-up podcast from NAB, my co-host Micah Van Hove posited that “cameras are done.” His thesis was that the latest video cameras have reached rough feature parity when measured by traditional metrics like resolution, dynamic range, frame rates, and codecs. We deemed it a “comforting” NAB because, as filmmakers, there was nothing new to worry about that would make our current cameras obsolete.

And then the next day I went to the Lytro demo. So much for “nothing new.” As for making current cameras obsolete… you can see a scenario where the future, as envisioned by Lytro, is one in which light field photography is ubiquitous. Remember when you distinguished a camera phone from a smartphone (and a smartphone from a “regular” phone)? Now that the technology is mature, they’re all just “phones” again; capturing images and using the internet have become part of what’s considered standard. Similarly, maybe light field photography will just be called “photography” one day. Is it only a matter of time until all cinematographers are working with light fields?

Democratization of the technology

Right now the tools and storage necessary to process all of this data are enterprise-only. The camera is the size of a naval artillery gun. It is tethered to a full rack of servers. But with Moore’s law and rapid gains in cloud computing, Lytro believes the technology will come down in size to the point where, as Karafin says, it reaches a “traditional cinema form factor.”

If Lytro can also get the price down, you can see a scenario where on an indie film—where time is even shorter than on a larger studio film—the ability to “fix” a focus pull or stabilize a camera without any corresponding loss in quality would be highly desirable. That goes for documentary work as well—take our filming of this demo, for example. The NFS crew filmed what will end up being over a hundred videos in 3.5 days at NAB. To take advantage of the opportunity to film this demo we had to come as we were—the tripod was back at the hotel (the NAB convention center is 2,000,000 square feet, so when you’re traversing the floor, you travel light). As a result, I’m sure viewers of our video demo above will notice some stabilization artifacts from our handheld camera. Had we captured the demo using light field technology, that stabilization could have been artifact-free.

In filmmaking, for every person working in real-world circumstances, where time and money are short and Murphy’s Law is always in effect, there are ten people uninvolved in the production who are quick to chime in from the sidelines with, “that’s not the way you’re supposed to do it.” But experienced filmmakers know that no matter how large your budget or how long your shooting schedule, there is no such thing as an ideal circumstance. You try to get it right on the day, with the personnel and the equipment you have on hand, but you are always cleaning up something after the fact. Even Fincher reframes in post. Lytro allows for much more than reframing.

The ability to “fix everything in post” is surely not the primary goal of Lytro (see “storytelling implications” below), but it’s an undeniable offshoot of capturing all of this data. And should the technology be sufficiently democratized, it would be enabling for all levels of production.

For anyone opposed to this greater amount of control, let me ask: do you shoot with LUTs today? If so, you are using a Look Up Table to help the director and cinematographer see their desired look on the monitors during the shoot, but that look is not “baked in.” In post, you still have all of the RAW data to go back to. In a way, to argue against Lytro’s ability to gather the maximum amount of data and retain maximum flexibility in post would also be to argue against something like shooting RAW (not to mention that celluloid could be developed in any number of ways… which always took place after shooting).
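
The LUT analogy can be made concrete in a few lines of Python: the “look” is just a table the monitoring path indexes into, while the source data is never modified. The S-curve below is an arbitrary stand-in for a real creative LUT.

```python
# Minimal LUT sketch: a look for monitoring, with the source untouched.
import numpy as np

def make_contrast_lut(size=256):
    """Build a simple contrast S-curve as a 1D lookup table."""
    x = np.linspace(0.0, 1.0, size)
    return np.clip(0.5 + (x - 0.5) * 1.4, 0.0, 1.0)

def apply_lut(image, lut):
    """Index each pixel into the table; the input image is untouched."""
    idx = (image * (len(lut) - 1)).astype(int)
    return lut[idx]

raw = np.random.rand(4, 4, 3)        # stands in for untouched RAW data
monitor = apply_lut(raw, make_contrast_lut())
# 'raw' is unchanged: in post you can still grade from it however you like.
```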


Corresponding advances in display technologies

Imaging isn’t going to advance only from an acquisition standpoint; there will be a corresponding advance in display technologies. Taking all of this information and displaying it on a flatscreen monitor (even a conspicuously large one) feels like a compromise. Lytro surely isn’t aiming to bring the best-looking images to the existing ecosystem of 50″ flat screens; they must be thinking way beyond that.

Imagine your entire wall is a display, not just a constrained rectangle: you’re going to want that extra resolution, going well past 4K. Imagine you can view material in 3D without glasses or headaches: you’re going to want the depth information and control. Imagine you’re wearing a headset and you can look around inside of the video (and each eye, in stereoscopic 3D, effectively doubles your resolution needs): you’re going to want all of these things together.

What are the storytelling implications of Lytro?

With these display advances, storytelling will change—just as it did when the basic black-and-white motion picture added sound, color, visual effects, and surround sound (I was going to add 3D to the list, but there’s still so much backlash against it, and the technology is still in its infancy, that it’s not a great example… yet). Suffice it to say that the experience of watching Clouds Over Sidra on a VR headset is entirely different from watching it contained within the boundaries of a small rectangle on a flat screen. The storytelling was informed by the technology.

In an era where video games and computer-generated animation are seemingly advancing faster technologically than traditional filmed live-action, Lytro shows there are still plenty of new techniques in the “live action” toolbox. But on its own a tool does not change anything; it will always come back to the same age-old question: can you tell a story with it? Lytro is a newfangled, technically complex, eyebrow-raising, mind-blowing tool. But, as ever, what you do with it is up to you.

 

 

(Source: http://bit.ly/21imkmK)

 

Now You Can Listen To YouTube Songs On Your Android Phone With The Screen Turned Off!

Wherever you are, whatever the time, when you’re looking for a song to listen to, you inevitably find yourself on YouTube. While it is the repository of some of the craziest, funniest and scariest videos on the net, YouTube is also the go-to app for anyone who’s looking for a song.


But if there’s any complaint that Android users have with listening to songs on YouTube it’s this: the app keeps the screen on while you’re playing the song.

Now this, as any smartphone user will tell you, is not good for an already overworked battery. But there’s an app that will help you get around it.

Presenting the Black Screen of Life, an app that turns your screen off when the proximity sensor is blocked. So when you’re listening to music on YouTube, all you need to do is cover the proximity sensor and, hey presto, you’ve saved bucketfuls of power.

 

Bioluminescent Forest Created Using Projectors

The projection-mapping piece ‘bioluminescent forest’ was made by artists Friedrich van Schoor and Tarek Mawad.

The artists spent six weeks in the forest, fascinated by its silence and natural occurrences, especially the phenomenon of bioluminescence. They personified the forest to accentuate its natural beauty, creating alluring luminescent plants and glowing magical mushrooms that speak volumes to any visitor who enters the minds of the artists through viewing ‘bioluminescent forest’.

Below you will find behind-the-scenes photos as well as a ‘making of’ video.

 

 

(Source: http://www.comedytrash.com)

 

The first Doctor Strange trailer is here and we see possibilities

The first trailer for Marvel’s Doctor Strange is here, and as promised, it’s quite a trip. Benedict Cumberbatch unveiled the movie’s first trailer on Jimmy Kimmel Live! Tuesday night, and by far the most alarming part of the teaser is hearing the Brit speak with an American accent.

Directed by Scott Derrickson, Doctor Strange tells the story of Stephen Strange (Cumberbatch), an acclaimed neurosurgeon who uncovers the hidden world of magic and alternate dimensions after a near-fatal car accident. As you can see in the trailer, this film will expose parts of the Marvel Cinematic Universe that have been previously unexplored. (Think Inception for the MCU.)

We’re already intrigued by Tilda Swinton, who portrays Strange’s kickass mentor, the Ancient One. “You’re a man looking at the world through a keyhole,” she tells Strange in the trailer. “You’ve spent your life trying to widen it. Your work saved the lives of thousands. What if I told you that reality is one of many?”

 

Notably, we also get a glimpse of Rachel McAdams in the two-minute teaser, and it looks like she’s playing a nurse — a night shift nurse, perhaps? Time, and maybe a bit of sorcery, will indeed tell. Still, we need 100 percent more Chiwetel Ejiofor.

Nike Just Made This Remarkable Farewell For Kobe Bryant

Love him or hate him, Kobe Bryant is a legend.

In China, they mostly love him—he’s made a concerted effort, with Nike, to reach out to his Chinese fans over the past decade. And the Chinese have responded with adulation all but unmatched for American sports stars.

Now, with Bryant’s last game fast approaching, Nike and Wieden + Kennedy Shanghai have created a stirring 60-second tribute commercial. And it’s all about that love—which Bryant says might actually be a bit misguided.

“Kobe has an intimate relationship with the Chinese ballers, so he knows exactly how to teach and motivate them,” says Terence Leong, creative director of W+K Shanghai. “Together with Nike China and Kobe, our team crafted the script and made sure the film was just as provocative as the man himself. It was an intense and uncompromising process because Kobe was just as demanding on the creative team as he was on the Lakers.”

W+K creative director Azsa West adds: “[Kobe] chose to focus on becoming a legend rather than being a hero. When it comes to winning, Kobe is willing to push himself to risk everything. Because standing back and doing nothing, that’s real failure. This philosophy is very Nike ‘Just do it,’ and Kobe is the perfect person to deliver this spirit of Nike.”

CREDITS
Client: Nike
Campaign: “Kobe Last Season”
Spot: “Don’t Love Me, Hate Me”
Launch Date: 7 April 2016

Agency: Wieden + Kennedy Shanghai | Executive Creative Director: Yang Yeo | Creative Directors: Terence Leong, Azsa West
Copywriters: Nick Finney, Wei Liu | Senior Art Director: Shaun Sundholm | Senior Designers: Patrick Rockwell, Will Dai
Integrated Production Director: Angie Wong | Assistant Producers: Yuan Fang, Jiji Hu | Offline Editor: Hiro Ikematsu
Business Director: Dino Xu | Associate Account Director: Jim Zhou | Account Executive: Shawn Kai | Senior Planner: Paula Bloodworth
Digital Strategist: Bill Tang | Project Manager: Nicole Bee | Business Affairs: Jessica Deng, Kathy Zhan

Production Companies: Elastic TV; Lunar Films | Director: Biff Butler | Line Producer: Kelly Christensen
Director of Photography: Rachel Morrison | Executive Producer (Elastic): Belinda Blacklock | Managing Director (Elastic): Jennifer Sofio Hall
Executive Producer (Lunar): Ken Yap | Post Producer (Lunar): Jeff Tannebring

Editing: Rock Paper Scissors | Editors: Biff Butler, Alyssa Oh | Post Producer: Christopher Noviello
Executive Producers: Angela Dorian, Linda Carlson

Postproduction: a52 | 2-D Visual Effects Artists: Michael Vaglienty, Adam Flynn
Smoke Artist: Chris Riley | Conform: Gabe Sanchez | Rotoscope Artists: Tiffany German, Cathy Shaw, Robert Shaw
Colorist: Paul Yacono | Design: Pete Sickbert-Bennett | 2-D, 3-D Animation: David Do, Steven Do, Claudia De Leon, Sam Cividanis
Senior Color Producer: Jenny Bright | Producer: Drew Rissman | Head of Production: Kim Christensen
Deputy Head of Production: Carol Salek | Executive Producer: Patrick Nugent

Original Music, Sound Design, Mix: Lime Studios | Original Music: Andy Huckvale
Mixer: Zac Fisher | Assistant Mixer: Kevin McAlpine | Executive Producer: Susie Boyajan

Source: Adweek

5 Twitter Growth Tools You Should Be Using Now

Are you looking to be the best social media manager / growth hacker you can be? Well, today I’ve posted my 5 favorite tools to grow and manage your Twitter accounts. By using these tools correctly, you WILL grow your Twitter account.

1. ManageFlitter
If you’re looking to grow your Twitter account, I strongly recommend checking out this tool. ManageFlitter comes with several key features that make it essential for the Twitter growth hacker. To start, it finds everyone who doesn’t follow you back so you can easily unfollow them; if they aren’t following you, they aren’t seeing your tweets anyway. You can also use PowerMode, which lets you create a set of search filters to find users who are most likely to follow you back and engage. Take it a step further with the Power Post feature to schedule and track tweets so they go out when your audience is most active online. Lastly, it comes with analytics so you can track growth. Believe me, if you use it right, you will see growth in your account.

2. SocialBro
SocialBro is a great tool for understanding your audience, with analytics that break it down in detail. Beyond that, it lets you automate replies, such as sending an automatic introduction when someone follows you. The feature is called Rules; try it out! This is also a great tool to grow your Instagram account, but that’s a different story.

3. Twibble.io
Part of growing a Twitter account is posting great content regularly. Twibble is a service that automatically posts content from any RSS feed to Twitter. Unlike other RSS-to-Twitter services, this beautifully designed tool posts an image with each tweet and offers analytics, hashtag testing, and more (a minimal DIY sketch of the RSS-to-Twitter pattern appears after this list). You can also automate posting Instagram images or Pinterest pins if you’re looking to make your Twitter account super visual!

4. Buffer
Similar to Twibble, Buffer allows you to schedule content in advance. It’s a super powerful tool that should be in any social media manager’s kit, and it saves you so much time. It’s also great because its analytics features show which of your posts performed best and make them easy to re-post. Best used with evergreen content!

5. CoSchedule
If you manage a blog, you’ll want to check out CoSchedule. This tool makes it easy to manage an editorial calendar with multiple team members. It should be your first line of attack when posting content from your blog: it lets you schedule your social posts ahead of time, shows which content performs best, and makes resharing easy.
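
As promised in the Twibble entry above, here is a minimal DIY sketch of the RSS-to-Twitter pattern these services automate, using the real feedparser and tweepy libraries. The feed URL and credentials are placeholders; a production version would persist the set of posted links and handle rate limits, images, and errors.

```python
# Minimal RSS-to-Twitter sketch (what services like Twibble automate).
# Placeholders: FEED_URL and all credentials; persist 'posted' in practice.
import feedparser
import tweepy

FEED_URL = "https://example.com/blog/rss"   # hypothetical feed

auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
api = tweepy.API(auth)

posted = set()   # links already tweeted (store in a file or DB in practice)

def post_new_entries():
    feed = feedparser.parse(FEED_URL)
    for entry in feed.entries:
        if entry.link in posted:
            continue
        # Truncate the title so title + link stay within the tweet limit.
        api.update_status(f"{entry.title[:110]} {entry.link}")
        posted.add(entry.link)

if __name__ == "__main__":
    post_new_entries()
```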