Review: Living the ‘Dream,’ on Your Laptop or Phone

Do you know of a site where the wild thyme blows? You do now.

“Dream,” an interactive experience from the Royal Shakespeare Company, which runs through Saturday and lasts about as long as a power nap, transports its thousands of viewers to a sylvan grove, then to a rehearsal space in Portsmouth, England, for a live Q&A. Tickets are free, though those who prefer a lightly interactive experience can purchase seats for 10 British pounds (about $14) and appear onscreen as fireflies.

Inspired by Shakespeare’s “A Midsummer Night’s Dream” — in the wispiest, most gossamer way imaginable — “Dream” signifies a bounding leap forward for theater technology and a short jog in place for theater itself.

A different “Dream” was meant to open in Stratford-upon-Avon about a year ago, as a showcase for Audience of the Future, a consortium of institutions and tech innovators assembled in 2019 and tasked with exploring new ways to make and deliver theater remotely. (Theater on your phone? They saw it first.) The 2020 “Dream” would have played to both a live audience and a remote one, integrating actors, projections and live motion-capture into a verdant whole.

But in-person audiences are rare these days, and this remote “Dream,” however gorgeous — and it is gorgeous, enormously gorgeous — feels thinner for it, less a forest of imagination and more a small copse of some really lovingly rendered trees. It begins with Puck (E.M. Williams), that merry wanderer of the night, imagined here as an assemblage of pebbles in the approximate shape of a human body. Why render Puck — nimble, fleet and girdling the earth in the time it takes most of us to load the dishwasher — as a pile of rocks? Dunno. Looks cool.

In traveling around the forest, Puck encounters Shakespeare’s other fairies, like Moth (an accumulation of moths), Peaseblossom (sticks and flowers) and Cobweb (an eyeball inside a squirrel’s drey). Apparently, Puck also met Mustardseed (more sticks?). I missed it. And the singer Nick Cave contributed some voice acting! I missed that, too.

“Dream,” performed live, is exquisite, denatured and almost entirely contentless. It isn’t quite theater, and it isn’t precisely film, though it could pass for a highbrow “Avatar” short. For stretches, it resembles a meditative video game, but it isn’t that either, mostly because the interactive elements (clicking and dragging fireflies around the landscape) are wholly inconsequential.

Watching it, I felt inexplicably cranky, like a toddler who has been offered a variety of perfectly nice snacks but doesn’t want any of them. Because maybe what the toddler really wants is to safely see an actual play in an actual theater with an actual audience. And that just isn’t available right now.

So I don’t really know what to say about “Dream.” Because it represents an obviously fruitful and seemingly happy collaboration among top-of-their-game actors, directors, designers, composers and technicians, many of whom assumed some physical risk in the making of it. (Among them are Robin McNicholas, credited with direction and narrative development; Pippa Hill, credited with script creation and narrative development; and Esa-Pekka Salonen, the production’s music director and co-composer.) It also signals real progress in the use of live motion-capture (something the Royal Shakespeare Company has already experimented with) and offers a tantalizing glimpse of how that technology might be used when proper in-person theater returns.

But this isn’t proper theater. Or even improper theater. It’s a sophisticated demonstration of an emergent technology. Shakespeare is the pretext, not the point. The pentameter, pushed into random virtual mouths, helps us better appreciate the software architecture — which is great if you like software and less great if you like the language itself, or the original play’s plot or characters or keen insights into our big, dumb, desiring hearts. This “Dream” is beautiful. Wouldn’t it be nice if we could all wake up now?

Dream
Through March 20; dream.online

Whole Foods will soon let customers pay for groceries with palm scan

Whole Foods will soon let customers pay for groceries using its parent company’s palm-scanning technology.

Amazon said Wednesday its palm-scanning system — currently used in about a dozen of its brick-and-mortar stores — will debut at a Whole Foods in Seattle’s Capitol Hill neighborhood, the first of many planned rollouts at other locations.

The system uses Amazon One technology, which employs high-tech imaging and algorithms to create and detect a “unique palm signature” based on the ridges, lines and veins in each person’s hand.

Unlike Apple’s fingerprint technology, its high-tech sensors don’t require users to touch the scanning surface.

Instead, the palm-reading tech uses computer vision and depth geometry to process and identify the shape and size of each hand it scans before charging a credit card on file.
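Amazon hasn’t published how its matching actually works, but the flow described above — derive a “unique palm signature” from imagery, then compare it against enrolled customers — resembles standard biometric identification. Here is a minimal, purely illustrative Python sketch of that pattern; the `embed_palm` stand-in, the similarity threshold, and the in-memory registry are all assumptions for illustration, not Amazon’s implementation.

```python
import numpy as np

MATCH_THRESHOLD = 0.92  # assumed value: tuned so non-matching palms fall below it


def embed_palm(palm_image: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for the 'palm signature' model.

    A real system would run a trained network over the ridge, line and
    vein imagery; here we just flatten and L2-normalize the pixels so
    that a dot product between two signatures is a cosine similarity.
    """
    vec = palm_image.astype(np.float64).ravel()
    return vec / (np.linalg.norm(vec) + 1e-12)


class PalmRegistry:
    """Toy enrollment store mapping customer IDs to palm signatures."""

    def __init__(self) -> None:
        self._signatures: dict[str, np.ndarray] = {}

    def enroll(self, customer_id: str, palm_image: np.ndarray) -> None:
        # In production the signature would be encrypted at rest in the cloud.
        self._signatures[customer_id] = embed_palm(palm_image)

    def identify(self, palm_image: np.ndarray) -> str | None:
        """Return the best-matching customer, or None if nobody clears the threshold."""
        probe = embed_palm(palm_image)
        best_id, best_score = None, MATCH_THRESHOLD
        for customer_id, signature in self._signatures.items():
            score = float(np.dot(probe, signature))  # cosine similarity of unit vectors
            if score > best_score:
                best_id, best_score = customer_id, score
        return best_id
```

In a checkout flow like the one the article describes, a match above the threshold would trigger a charge to the card on file, while a miss would fall back to the store’s other payment options.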

Amazon One will debut at a Whole Foods in Seattle’s Capitol Hill neighborhood, with many rollouts at other locations planned for the future. (Shannon Stapleton/Reuters)

The company said that the palm-scanning tech will be offered as just one of many payment options at participating Whole Foods stores and that it won’t impact store employees’ job responsibilities.

“At Whole Foods Market, we’re always looking for new and innovative ways to improve the shopping experience for our customers,” said Arun Rajan, senior vice president of technology and chief technology officer at Whole Foods Market.

Palm images used by Amazon One are encrypted and stored in a “highly secure” cloud, and customers can request to have their palm data deleted.

The company claims palm-scanning tech is more private than other biometric alternatives, such as facial recognition.

Amazon One builds on the “Just Walk Out” technology that Amazon uses in its Go stores, which detects the items shoppers pick up and charges them once they leave — without the need for a checkout line.

Amazon is also planning to expand the cashier-less technology to Whole Foods, as reported by The Post.

Meanwhile, the tech could be good for its bottom line. The online behemoth aims to sell its palm-scanning tech to other businesses, including retailers, stadiums and office buildings.

The scanner uses high-tech imaging and algorithms to create and detect a unique palm signature, which is then encrypted and stored in a secure cloud. (Amazon)

Last September, it said it was in “active discussions with several potential customers.” But it is unclear if it has progressed on any of those fronts.

Apple’s new iPad Pros and TV remote don’t have U1 locators to help find them in your couch

Apple has been quietly sticking special locator beacon chips into some of its new iPhones that’ll let you unlock your car and find lost items through walls — the latter thanks to the $29 AirTags announced today — but sadly, you won’t find that chip in the new M1-based iPad Pros or the long-awaited new Siri remote for the Apple TV.

Apple confirmed to us that the U1 locator chip, which uses pulses of ultra-wideband (UWB) radio to broadcast its precise location, won’t appear in the Siri remote. We’re waiting on final, bulletproof confirmation about the iPad Pros, but the chip doesn’t appear on their product page, spec sheet, or press release. Last year’s iPad Pros didn’t include a U1 chip, either.

Is Apple expecting us to stick AirTags to our iPads and TV remotes to escape the jaws of the ever-ravenous couch? Unlikely, but the company has been pretty choosy about which devices get the chip so far. You can find it in the iPhone 11 and newer (but not the iPhone SE) and the Apple Watch Series 6 (but not the Apple Watch SE), but we’re pretty sure it hasn’t made its way to any iPads or MacBooks announced since the chip’s introduction in September 2019.

Theoretically, Apple could build an ecosystem where any Apple device can easily find any other Apple device (not to mention UWB devices from Samsung, which is also deeply invested in the tech and has its own AirTag-like device). But for now, you’ll primarily be using your phone to find AirTags, not other gadgets — except perhaps your future car.

Your iPhone has a completely hidden app. Here’s how to find and use it

Apple’s iPhone is full of hidden features and tricks we’re constantly discovering. For instance, did you know the Notes app has a hidden document scanner? Yeah, pretty cool. The latest hidden feature that’s been popping up on Twitter and blogs is another type of scanner, dedicated to QR codes, and it’s better than the one built into the camera app.

Granted, you can already scan QR codes using the shortcut in Control Center, or simply open the Camera app and point it at a QR code. And you’d be right: both of those methods work just fine. But the dedicated Code Scanner app goes one step further by presenting more of the information I want to see about a scanned code.

For instance, the Camera app uses a small notification at the top of the screen to open a link or show you information, whereas the dedicated Code Scanner app makes it very clear what’s inside the QR code you just scanned. But here’s the rub: the Code Scanner app isn’t found on your home screen, nor is it in iOS 14’s new App Library.

As it turns out, the only way to find the Code Scanner app is to use the iPhone’s Spotlight search feature. Go to your iPhone’s home screen and swipe down in the middle of the screen. A search bar will appear at the top of your screen, along with app and shortcut suggestions beneath it. Type either “code” or “scanner.” As you type, you’ll see the Code Scanner app icon show up as an app suggestion. Tap it to open.

The flashlight icon at the bottom of the screen acts as a flash to illuminate a code if your phone is struggling to read it.

If you don’t have the QR scanner shortcut added to Control Center yet, here’s a post showing you how to customize Control Center to your liking. For more hidden features, check out our list for iOS 14. We also cover more general but useful features in iOS 14.
