Tech

‘A Midsummer Night’s Dream,’ Sprinkled With High-Tech Fairy Dust


“A Midsummer Night’s Dream” may be one of Shakespeare’s most performed plays — but its latest version from the Royal Shakespeare Company will be unlike any seen before. Titled “Dream,” the 50-minute streamed production fuses live performance with motion-capture technology, 3-D graphics, and interactive gaming techniques that let the audience remotely guide Puck through a virtual forest.

As live theater sprinkled with some seriously high-tech fairy dust, “Dream” promises to bring “a most rare vision” of the play to our screens, to borrow a line from Shakespeare. It will be available to watch online once a day at various times from Friday through March 20.

“It’s part of our ongoing engagement with this brave new world,” said Gregory Doran, the Royal Shakespeare Company’s artistic director. In 2016, the theater’s production of “The Tempest” used live motion-capture technology to create a 3-D digital avatar that was projected above the stage.

The difference this time is that everything in the play — the performers and their surroundings — will be rendered virtually.

A cast of seven will perform in a specially built studio in Portsmouth, southern England, wearing Lycra motion-capture suits outfitted with sensors. They will be surrounded by a 360-degree rig of 47 cameras, and every movement will be rendered almost instantaneously onto digital avatars, which are relayed to viewers via the stream. These magical figures move seamlessly through a computer-generated woodland, and the action is narrated in husky tones by the Australian singer-songwriter Nick Cave as the forest's voice.
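The article doesn't detail the production's data pipeline, but the shape it describes (sensor readings captured per performer, poses applied to avatars, frames relayed live to viewers) can be sketched in broad strokes. The types and function names below are hypothetical, not the Royal Shakespeare Company's actual system:

```typescript
// Hypothetical sketch of a live motion-capture relay loop.
// Frame format and names are illustrative only.

interface Joint {
  name: string;                                // e.g. "leftElbow"
  position: [number, number, number];          // meters in studio space
  rotation: [number, number, number, number];  // orientation as a quaternion
}

interface MocapFrame {
  performerId: string;
  timestampMs: number;
  joints: Joint[];
}

// Apply a captured pose to a digital avatar's skeleton each frame.
function applyFrameToAvatar(frame: MocapFrame, avatar: Map<string, Joint>): void {
  for (const joint of frame.joints) {
    avatar.set(joint.name, joint); // overwrite the last known pose
  }
}

// Relay each frame to connected viewers with minimal latency.
function broadcast(frame: MocapFrame, viewers: Set<WebSocket>): void {
  const payload = JSON.stringify(frame);
  for (const socket of viewers) {
    if (socket.readyState === WebSocket.OPEN) socket.send(payload);
  }
}
```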

For audiences watching at home, the virtual fairies moving through a digital forest will look more like a video game or a CGI blockbuster than your average Royal Shakespeare Company show. But the performances are delivered live and in real time. Every night’s performance will be unique.

With its abridged running time and a much-reduced cast of characters, “Dream” is not a full-scale production of “A Midsummer Night’s Dream”; rather, it is a narrative inspired by it, focusing on Puck and the fairies. But don’t expect any cute digital wings: These are elemental, mysterious forces of nature.

The arts collective Marshmallow Laser Feast, which works with virtual, mixed and augmented reality, has created digital avatars for the actors that look as if they have sprung from the natural world. Puck is formed of pebbles and stones, while Titania's fairies are made up of moth wings, cobwebs, earth or roots. The fairies are shape-shifters that coalesce into recognizable human and animal forms onscreen, and grow or shrink so that they are small enough to "creep into acorn-cups," as Puck puts it.

“It’s a form of puppetry,” said the Royal Shakespeare Company’s director of digital development, Sarah Ellis. “Those avatars come alive when they breathe, and how they breathe is through the live actor.”

The software that drives the performance, called Unreal Engine, is used across the video games industry and is behind popular titles like “Gears of War” and “Fortnite.” Since 2013, the company that developed it, Epic Games, has been branching out to create interactive 3-D content with the tool for film and TV, and, increasingly, for live events such as music festivals, museum exhibitions and theater productions.

Layering the tech with live performance, and relaying it instantly via a web player to thousands of devices, is an experiment for both Epic Games and the Royal Shakespeare Company. And then there’s the interactive component.

Up to 2,000 audience members for each performance can become part of the show, and will be invited to guide Puck through the forest. Onscreen, the chosen spectators will appear as a cloud of tiny fireflies: By using their mouse, trackpad or finger on the screen of a smart device, they will be able to move their firefly around the screen, and Puck will follow their lead through the virtual space.
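The web player's internals aren't public, but the mechanic described here, pointer input moving a firefly whose position Puck follows, is straightforward to sketch. The message shape and URL below are assumptions for illustration:

```typescript
// Hypothetical sketch of the firefly interaction: pointer input moves a
// firefly on screen, and its position is sent to the show's server so
// Puck can follow the audience's lead. Message shape is an assumption.

const canvas = document.querySelector('canvas')!;
const socket = new WebSocket('wss://example.invalid/dream'); // placeholder URL

let firefly = { x: 0.5, y: 0.5 }; // normalized screen coordinates

// Track mouse, trackpad or touch input and move the firefly toward it.
canvas.addEventListener('pointermove', (event: PointerEvent) => {
  const rect = canvas.getBoundingClientRect();
  firefly = {
    x: (event.clientX - rect.left) / rect.width,
    y: (event.clientY - rect.top) / rect.height,
  };
});

// Send the firefly's position a few times per second; the server would
// aggregate up to 2,000 fireflies into a path for Puck to follow.
setInterval(() => {
  if (socket.readyState === WebSocket.OPEN) {
    socket.send(JSON.stringify({ type: 'firefly', ...firefly }));
  }
}, 200);
```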

“Without the fireflies — the audience — Puck wouldn’t be going anywhere,” said E.M. Williams, who plays the role. “The audience are very much the fuel, the energy, of the show.”

In a traditional stage production, the "tech" rehearsals come last, after weeks of work by the actors on character and narrative. For "Dream," the process began with fittings for the motion-capture suits, so the players could calibrate their movements. Their digital avatars were reflected on giant LED screens around the studio to orient the performers within the virtual environment.

“It looks so 3-D, like it’s coming out the screen sometimes,” Williams said of the computer-generated forest. “There are times when if I touch it, I expect to feel it. It’s thinning the veil between the technological world and the real world.”

The Royal Shakespeare Company has long been seen as a bastion of traditional British theater: reverent toward text and verse, powered by great actors. Did the company anticipate any resistance to its high-tech, experimental approach? Several reviewers said its motion-capture “Tempest” was gimmicky.

“There’ll be some criticism, of course,” said Doran, the company’s artistic director. But, he added, he hoped “Dream” could speak to a traditional theater audience, as well as viewers drawn in by the technology.

Besides, the genius of Shakespeare means his plays can take whatever new inventions are thrown at them. “It’s the same as an experimental production of any of these plays,” Doran said. “Shakespeare is robust: He’ll still be there.”

Dream
Presented online by the Royal Shakespeare Company, March 12-20; dream.online.


Tech

Whole Foods will soon let customers pay for groceries with palm scan


Whole Foods will soon let customers pay for groceries using its parent company’s palm-scanning technology.

Amazon said Wednesday its palm-scanning system — currently used in about a dozen of its brick-and-mortar stores — will debut at a Whole Foods in Seattle's Capitol Hill neighborhood, the first of many planned rollouts at other locations.

The system uses Amazon One technology, which employs high-tech imaging and algorithms to create and detect a “unique palm signature” based on the ridges, lines and veins in each person’s hand.

Unlike Apple's fingerprint technology, its high-tech sensors don't require users to touch the scanning surface.

Instead, the palm-reading tech uses computer vision and depth geometry to process and identify the shape and size of each hand it scans before charging a credit card on file.
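Amazon hasn't published how Amazon One performs matching, but the flow described here (derive a signature from palm imaging, compare it against enrolled signatures, then charge the card on file) follows a generic biometric-identification pattern. Everything below, including the similarity measure and threshold, is an assumption, not Amazon's implementation:

```typescript
// Generic sketch of a biometric match-then-charge flow, assuming the
// scanner yields a fixed-length feature vector (the "palm signature").
// Nothing here reflects Amazon One's actual implementation.

type PalmSignature = number[]; // feature vector from the imaging pipeline

interface EnrolledCustomer {
  customerId: string;
  signature: PalmSignature; // stored encrypted server-side in practice
}

// Cosine similarity between two signatures; 1.0 means identical direction.
function similarity(a: PalmSignature, b: PalmSignature): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

const MATCH_THRESHOLD = 0.98; // assumed; real systems tune this carefully

// Find the best-matching enrolled customer, or null if none is close enough.
function identify(scan: PalmSignature, enrolled: EnrolledCustomer[]): string | null {
  let best: { id: string; score: number } | null = null;
  for (const c of enrolled) {
    const score = similarity(scan, c.signature);
    if (!best || score > best.score) best = { id: c.customerId, score };
  }
  return best && best.score >= MATCH_THRESHOLD ? best.id : null;
}
```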

Amazon One will debut at a Whole Foods in Seattle's Capitol Hill neighborhood, with many rollouts at other locations planned for the future. (Photo: Shannon Stapleton/Reuters)

The company said that the palm-scanning tech will be offered as just one of many payment options at participating Whole Foods stores and that it won't impact store employees' job responsibilities.

“At Whole Foods Market, we’re always looking for new and innovative ways to improve the shopping experience for our customers,” said Arun Rajan, senior vice president of technology and chief technology officer at Whole Foods Market.

Palm images used by Amazon One are encrypted and stored in a “highly secure” cloud, and customers can request to have their palm data deleted.

The company claims palm-scanning tech is more private than other biometric alternatives, such as facial recognition.

Amazon One builds on the "Just Walk Out" technology that Amazon uses in its Go stores, which detects the items shoppers pick up and charges them once they leave — without the need for a checkout line.

Amazon is also planning to expand the cashier-less technology to Whole Foods, as reported by The Post.

Meanwhile, the tech could be good for Amazon's bottom line. The online behemoth aims to sell its palm-scanning tech to other businesses, including retailers, stadiums and office buildings.

The scanner uses high-tech imaging and algorithms to create and detect a unique palm signature, which is then encrypted and stored in a secure cloud. (Photo: Amazon)

Last September, it said it was in "active discussions with several potential customers." But it is unclear whether any of those talks have progressed.


Tech

Apple’s new iPad Pros and TV remote don’t have U1 locators to help find them in your couch


Apple has been quietly sticking special locator beacon chips into some of its new iPhones that’ll let you unlock your car and find lost items through walls — the latter thanks to the $29 AirTags announced today — but sadly, you won’t find that chip in the new M1-based iPad Pros or the long-awaited new Siri remote for the Apple TV.

Apple confirmed to us that the U1 locator chip, which uses pulses of ultra-wideband (UWB) radio to broadcast its precise location, won't appear in the Siri remote. We're waiting on final bulletproof confirmation about the iPad Pros, but the chip also doesn't appear on their product page, spec sheet, or press release. Last year's iPad Pros didn't include a U1 chip, either.
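The precision of UWB comes from timing those radio pulses: radio travels roughly 30 centimeters per nanosecond, so nanosecond-level timestamps yield centimeter-level distances. A toy two-way-ranging calculation, illustrating the general technique rather than Apple's actual firmware:

```typescript
// Toy illustration of why UWB can locate devices precisely: distance is
// derived from the time a radio pulse takes to travel between devices.
// This is generic two-way-ranging arithmetic, not Apple's U1 design.

const SPEED_OF_LIGHT_M_PER_S = 299_792_458;

// Two-way ranging: device A timestamps the send and receive of a pulse
// that device B echoes back after a known processing delay.
function distanceMeters(
  roundTripSeconds: number,
  replyDelaySeconds: number,
): number {
  const timeOfFlight = (roundTripSeconds - replyDelaySeconds) / 2;
  return timeOfFlight * SPEED_OF_LIGHT_M_PER_S;
}

// A 10-nanosecond one-way flight corresponds to roughly 3 meters:
console.log(distanceMeters(120e-9, 100e-9)); // ≈ 3.0
```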

Is Apple expecting us to stick AirTags to our iPads and TV remotes to escape the jaws of the ever-ravenous couch? Unlikely, but the company has been pretty choosy about which devices get the chip so far. You can find it in the iPhone 11 and newer (but not the iPhone SE) and the Apple Watch Series 6 (but not the Apple Watch SE), but we're pretty sure it hasn't made its way to any iPads or MacBooks announced since the chip's introduction in September 2019.

Theoretically, Apple could build an ecosystem where any Apple device can easily find any other Apple device (not to mention UWB devices from Samsung, which is also deeply invested in the tech and has its own AirTag-like device as well). But for now, you’ll primarily just be using your phone to find AirTags, not other gadgets, except perhaps your future car.


Tech

Your iPhone has a completely hidden app. Here’s how to find and use it


Apple’s iPhone is full of hidden features and tricks we’re constantly discovering. For instance, did you know the Notes app has a hidden document scanner? Yeah, pretty cool. The latest hidden feature that’s been popping up on Twitter and blogs is another type of scanner, dedicated to QR codes, and it’s better than the one built into the camera app.

To be fair, you can already scan QR codes using the shortcut in Control Center, or by simply opening the Camera app and pointing it at a code. Both of those methods work just fine. But the dedicated Code Scanner app goes a step further by presenting more of the information you'd want to see about a scanned code.

For instance, the Camera app uses a small notification at the top of the screen to open a link or show you information, whereas the dedicated Code Scanner app makes it very clear what's inside the QR code you just scanned. But here's the rub: The Code Scanner app isn't found on your home screen, nor is it found in iOS 14's new App Library.

The only way to find the Code Scanner app is to use the iPhone's Spotlight search feature. Go to your iPhone's home screen and swipe down in the center of the screen. A search bar will appear at the top of your screen, along with app and shortcut suggestions beneath it. Type either "code" or "scanner." As you type, you'll see the Code Scanner app icon show up as an app suggestion. Tap it to open the app.

The flashlight icon at the bottom of the screen acts as a flash to illuminate a code if your phone is struggling to read it.

If you don’t have the QR scanner shortcut added to Control Center yet, here’s a post showing you how to customize Control Center to your liking. For more hidden features, check out our list for iOS 14. We also cover more general, but useful features in iOS 14.
