Tech

Big Tech Wants Points for Jobs

This article is part of the On Tech newsletter. You can sign up here to receive it weekdays.

Here’s one more way that technology companies are becoming more like conventional corporations: When they talk about jobs, it’s often a political message.

Google last week detailed its expansion of offices, computer data centers and staff around the United States. The company didn’t say so, but it needs more people, buildings and infrastructure to keep growing and making money. It’s smarter politics and public relations to rebrand it as “investing in America.”

Google is not alone. Amazon has turned its mammoth work force into its loudest political message that the company is helping Americans and the economy. The iPhone manufacturer Foxconn keeps promising high-tech jobs at its Wisconsin factory, even though it hasn’t delivered on three years of hiring promises. Facebook and Apple regularly talk about how they support small businesses and help generate jobs at app companies.

Growing corporations are engines of economic growth, and it’s nothing new for them to brag about what they’re doing for political reasons. Defense contractors might suggest to members of Congress that cutting the Pentagon’s budget could lead to fewer jobs in a lawmaker’s district or state. Walmart tallies how much it buys from American suppliers.

But it’s still odd to see tech companies playing this same game of corporate soft power. This was an industry that for a long time said it didn’t need to do the usual corporate muck of lobbying and courting political power. This was never really true, but it’s gotten even less so.

As more people and politicians worry about the influence of technology companies in the economy and our lives, digital corporations have been forced to try harder to keep people feeling warm and fuzzy about them. One way to do that is to copy what boring old companies have always done: Get attention for their hiring and growth.

Amazon is the epitome of a company that uses its hiring and economic growth as a tool to influence how others perceive it. My colleague Karen Weise has written about Amazon’s using its growing staff of 1.3 million people as a force of political persuasion.

Workers at Amazon warehouses go to Washington to meet with members of Congress and give lawmakers safety vests with the names of the company’s warehouses in their districts. Amazon regularly talks up its job openings and new warehouses and offices, and it has a website that tallies how much the company spends in the United States.

It’s a compelling message. Few companies in the history of the United States have hired people at the rate Amazon has recently. And many towns and states want Amazon facilities in their backyards — and politicians want credit for bringing those jobs to their area.

It’s also undeniable that all that spending is for Amazon, not for America. The company’s sales are growing fast, and its commitment to get more packages to Prime members’ doorsteps in one day has required it to add workers, open more depots near major population centers and spend more on planes and trucks.

The desire to paint corporate necessity in the best possible light sometimes creates strange spectacles. Apple in 2018 basically patted itself on the back for paying taxes and buying equipment to make iPhones.

Tech companies are becoming just like every other for-profit corporation. They want to be seen as contributing to society, not just making money.


Tip of the Week

This tip from Brian X. Chen, The New York Times’s consumer technology columnist, made me immediately check my phone settings:

Many of us rely on our smartphones for our everyday cameras. But our phones collect lots of data about us, and camera software can automatically make a note of our location when we snap a photo. This is more often a potential safety risk than a benefit.

Let’s start with the positives. When you allow your camera to tag your location, photo management apps like Apple’s Photos and Google Photos can automatically sort pictures into albums based on location. That’s helpful when you go on vacation and want to remember where you were when you took a snapshot.

But when you’re not traveling, having your location tagged on photos is not great. Let’s say you just connected with someone on a dating app and texted a photo of your dog. If you had the location feature turned on when you snapped the photo, that person could analyze the data to see where you live.

Just to be safe, make sure the photo location feature is off by default.

To do this on iPhones: Open the Settings app, select Privacy, then Location Services and finally, Camera. Under “Allow Location Access,” choose “Never.”

On Android phones, open the Camera app and tap the Settings icon that looks like a gear. Scroll to “Tag locations” or “Save location,” and switch the toggle to the off position.

You might choose to turn the location feature on temporarily to document your vacation, but remember to turn it off when your trip is over.
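If you want to check whether a photo you are about to share still carries location data, you can inspect the file itself. Below is a minimal sketch in Python (standard library only) that walks a JPEG’s EXIF directory looking for the GPS sub-IFD pointer (tag 0x8825). The function name is ours, and the parser handles only the common baseline layout — EXIF stored in the first APP1 segment — so treat it as a teaching aid, not a full EXIF reader.

```python
import struct

GPS_IFD_TAG = 0x8825  # IFD0 tag that points to the GPS sub-IFD

def jpeg_has_gps(data: bytes) -> bool:
    """Return True if a JPEG byte string advertises a GPS sub-IFD in its EXIF."""
    if data[:2] != b"\xff\xd8":  # every JPEG starts with the SOI marker
        return False
    i = 2
    while i + 4 <= len(data):
        marker, size = struct.unpack(">HH", data[i:i + 4])
        if marker >> 8 != 0xFF:  # lost sync with the marker stream
            return False
        if marker == 0xFFE1 and data[i + 4:i + 10] == b"Exif\x00\x00":
            tiff = data[i + 10:i + 2 + size]  # TIFF header plus IFDs
            if len(tiff) < 10:
                return False
            endian = "<" if tiff[:2] == b"II" else ">"
            (ifd0,) = struct.unpack(endian + "I", tiff[4:8])
            (count,) = struct.unpack(endian + "H", tiff[ifd0:ifd0 + 2])
            # Each IFD entry is 12 bytes: tag, type, count, value/offset.
            for n in range(count):
                entry = tiff[ifd0 + 2 + 12 * n:ifd0 + 14 + 12 * n]
                if struct.unpack(endian + "H", entry[:2])[0] == GPS_IFD_TAG:
                    return True
            return False
        i += 2 + size  # jump to the next marker segment
    return False
```

For a photo on disk, you would call `jpeg_has_gps(open("photo.jpg", "rb").read())`; a `True` result means the image still embeds location coordinates.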


  • A very valuable chat app: My colleagues reported that Discord, a messaging app that is popular for group video games, has discussed selling the company to Microsoft. A sale may never happen, but the price that was discussed was more than $10 billion.

  • Meet Dr. Zoom: Some medical schools have replaced cadaver dissection with simulation software during the pandemic, and, yes, it’s as weird as it sounds. My colleague Emma Goldberg talked to physicians in training about how they’ve adapted to virtual learning in what is typically very hands-on education.

  • Want to feel old and irrelevant?! Ryan Kaji is 9. His family generates $30 million in annual revenue from YouTube channels featuring Ryan opening new toys, exercising and doing craft projects. His family told Bloomberg News that the real money from those videos comes from sales of related merchandise like branded toys and clothes.

It’s officially spring here in the Northern Hemisphere. Chill out to this stunning video of robins. (This was recommended by The New York Times Cooking newsletter.)


We want to hear from you. Tell us what you think of this newsletter and what else you’d like us to explore. You can reach us at ontech@nytimes.com.

If you don’t already get this newsletter in your inbox, please sign up here.

Whole Foods will soon let customers pay for groceries with palm scan

Whole Foods will soon let customers pay for groceries using its parent company’s palm-scanning technology.

Amazon said Wednesday its palm-scanning system — currently used in about a dozen of its brick-and-mortar stores — will debut at a Whole Foods in Seattle’s Capitol Hill neighborhood, the first of many planned rollouts at other locations.

The system uses Amazon One technology, which employs high-tech imaging and algorithms to create and detect a “unique palm signature” based on the ridges, lines and veins in each person’s hand.

Unlike Apple’s Touch ID fingerprint sensor, its high-tech sensors don’t require users to touch the scanning surface.

Instead, palm-reading tech uses computer vision and depth geometry to process and identify the shape and size of each hand it scans before charging a credit card on file.
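Amazon hasn’t published how Amazon One actually matches palms. Purely as an illustration of the general idea, biometric systems typically reduce a scan to a numeric feature vector and compare it against enrolled templates under a similarity threshold. The sketch below is hypothetical — the function names, the 4-dimensional vectors and the 0.95 threshold are all our own toy choices, not Amazon’s:

```python
import math

SIMILARITY_THRESHOLD = 0.95  # hypothetical acceptance threshold

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def match_palm(scan, enrolled):
    """Return the id of the best-matching enrolled template, or None."""
    best_id, best_score = None, SIMILARITY_THRESHOLD
    for user_id, template in enrolled.items():
        score = cosine_similarity(scan, template)
        if score >= best_score:
            best_id, best_score = user_id, score
    return best_id

# Usage: two enrolled "palm signatures" as toy 4-D feature vectors.
enrolled = {"alice": [0.9, 0.1, 0.3, 0.7], "bob": [0.2, 0.8, 0.6, 0.1]}
print(match_palm([0.88, 0.12, 0.31, 0.69], enrolled))  # prints: alice
```

A real system would store only encrypted templates, use far higher-dimensional features extracted by a vision model, and tune the threshold to trade off false accepts against false rejects.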

Amazon One will debut at a Whole Foods in Seattle’s Capitol Hill neighborhood, with many rollouts at other locations planned for the future.
Shannon Stapleton/Reuters

The company said that the palm-scanning tech will be offered as just one of many payment options at participating Whole Foods stores and that it won’t impact store employees’ job responsibilities.

“At Whole Foods Market, we’re always looking for new and innovative ways to improve the shopping experience for our customers,” said Arun Rajan, senior vice president of technology and chief technology officer at Whole Foods Market.

Palm images used by Amazon One are encrypted and stored in a “highly secure” cloud, and customers can request to have their palm data deleted.

The company claims palm-scanning tech is more private than other biometric alternatives, such as facial recognition.

Amazon One builds on the “Just Walk Out” technology that Amazon uses in its Go stores, which detects the items shoppers pick up and charges them once they leave — without the need for a checkout line.

Amazon is also planning to expand the cashier-less technology to Whole Foods, as reported by The Post.

Meanwhile, the tech could be good for its bottom line. The online behemoth aims to sell its palm-scanning tech to other companies like retailers, stadiums and office buildings.

The scanner uses high-tech imaging and algorithms to create and detect a unique palm signature which is then encrypted and stored in a secured cloud.
Amazon

Last September, it said it was in “active discussions with several potential customers.” But it is unclear if it has progressed on any of those fronts.

Apple’s new iPad Pros and TV remote don’t have U1 locators to help find them in your couch

Apple has been quietly sticking special locator beacon chips into some of its new iPhones that’ll let you unlock your car and find lost items through walls — the latter thanks to the $29 AirTags announced today — but sadly, you won’t find that chip in the new M1-based iPad Pros or the long-awaited new Siri remote for the Apple TV.

Apple confirmed to us that the U1 locator chip, which uses pulses of ultra-wideband (UWB) radio to broadcast its precise location, won’t appear in the Siri remote. We’re waiting on final bulletproof confirmation about the iPad Pros, but the chip also doesn’t appear on their product page, spec sheet, or press release. Last year’s iPad Pros didn’t include a U1 chip, either.

Is Apple expecting us to stick AirTags to our iPads and TV remotes to escape the jaws of the ever-ravenous couch? Unlikely, but the company has been pretty choosy about which devices get the chip so far. You can find it in the iPhone 11 and newer (but not the iPhone SE) and the Apple Watch Series 6 (but not the Apple Watch SE), but we’re pretty sure it hasn’t made its way to any iPads or MacBooks that have been announced since the chip’s introduction in September 2019.

Theoretically, Apple could build an ecosystem where any Apple device can easily find any other Apple device (not to mention UWB devices from Samsung, which is also deeply invested in the tech and has its own AirTag-like device as well). But for now, you’ll primarily just be using your phone to find AirTags, not other gadgets, except perhaps your future car.

Your iPhone has a completely hidden app. Here’s how to find and use it

Apple’s iPhone is full of hidden features and tricks we’re constantly discovering. For instance, did you know the Notes app has a hidden document scanner? Yeah, pretty cool. The latest hidden feature that’s been popping up on Twitter and blogs is another type of scanner, dedicated to QR codes, and it’s better than the one built into the camera app.

Indeed, you can already scan QR codes using the shortcut in Control Center, or simply by opening the Camera app, which will recognize a QR code automatically. Both of those methods work fine. However, the dedicated Code Scanner app goes a step further by presenting more of the information I want to see about a scanned code.

For instance, the Camera app uses a small notification at the top of the screen to open a link or show you information, whereas the dedicated Code Scanner app makes it very clear what’s inside the QR code you just scanned. But here’s the rub: the Code Scanner app isn’t found on your home screen, nor is it found in iOS 14’s new App Library.

As far as we can tell, the only way to find the Code Scanner app is to use the iPhone’s Spotlight search feature. Go to your iPhone’s home screen and swipe down in the middle of the screen. A search bar will appear at the top of your screen, along with app and shortcut suggestions beneath it. Type either “code” or “scanner.” As you type, you’ll see the Code Scanner app icon appear as an app suggestion. Tap it to open the app.

The flashlight icon at the bottom of the screen acts as a flash to illuminate a code if your phone is struggling to read it.

If you don’t have the QR scanner shortcut added to Control Center yet, here’s a post showing you how to customize Control Center to your liking. For more hidden features, check out our list for iOS 14. We also cover more general, but useful features in iOS 14.
