Category Archives: Technology

Minecraft: UK business is looking to hire virtual landscapers

WhatShed, which calls itself the largest independent buyers guide for garden buildings in the UK, is looking for virtual landscape gardeners to expand their “gardening passions” to the virtual world and “provide professional advice to players looking to improve their in-game outdoor space,” according to the job description.

Applicants should be passionate about gardening and Minecraft and have creative flair. As an added bonus, the job is remote, so applicants do not need to live in the UK to qualify.

“Successful consultants will get paid upwards of £50 ($70 US) an hour for their services when hired but will be able to set their own rates and work flexibly,” the listing says.

In addition to the job listing, WhatShed is also looking for clients interested in having a virtual landscaper come “work” on their virtual gardens.

The consultations will last about an hour and clients will be able to contact the same landscaper again if they want to change the design or layout.

The listing does not specify when candidates will be chosen or how long the listing will be up, but the company did note that it would not be responding to every application.

Read original article here

Magic the Gathering player accidentally opens a vintage pack and finds a rare Black Lotus

A man looking to diversify his investments from stocks to Magic the Gathering cards has found the highly prized Black Lotus card after accidentally opening a vintage pack. 

The man – known only as Michael – told PC Gamer he used to play the game in the mid-90s and invested an unknown amount – thought to be anywhere from $7,000 to $15,000 – in an unopened MTG card pack. 

Even though his pack couldn’t contain the sought-after Alpha Black Lotus, the Beta variant is still highly collectible as it too will never be reprinted. 

“I had read that you could search a pack and see the contents without opening,” Michael told PC Gamer. “I wanted to search it and if it had good cards in it, I would open. If not I would just hold it and let it appreciate in value as a sealed booster pack or keep as a piece of history.” 

But after accidentally breaking the seal while trying to get a sneaky peek at its contents, Michael found a rare prize indeed: the Beta Black Lotus card, which can be worth tens of thousands of dollars depending upon its condition. 

“I literally blank stared at it for a few seconds. My brain was full-on loading screen,” Michael said. “The first coherent thought I remember having was, ‘No. This is fake. This HAS to be fake. There is no way…’ 

“After that, it was a blur of me trying to find my card bag that had cases in it and yelling for my life partner to stay away from the table,” he said. “She thought something was wrong based on how I sounded. You have to realize, I was literally in my underwear running around the house trying to find a case to put this thing in!”

Michael is now looking to have the card professionally graded.

A rare Magic: The Gathering card recently sold at auction for over half a million dollars

The game’s most powerful and sought-after card, Black Lotus – which was released as part of the Alpha set in 1993 – popped up on eBay earlier this month and closed at the end of January, having sold for an astonishing $511,100. That’s three times the price paid for a similar quality card in 2019, and five times the price paid in 2018.

The steep price is thought to have been achieved due to the card’s mint condition – rated as “MT 10” – and the fact it’s been signed by artist Christopher Rush, who sadly died back in 2016.

ICYMI, the first Magic: Legends open beta will begin on March 23, giving players their first taste of the free-to-play, deck-building action-RPG.

You can sign up for the beta via the official Magic: Legends website. All you have to do is make a free Arc account and click the button in the top-right corner. Ahead of the beta, you can also put in for a slot in the final closed alpha.

Alpha and beta testing is exclusive to PC, but PS4 and Xbox One players will get to join in the full release later this year. In a new deck-building deep-dive, Perfect World gave us a better idea of what to expect from Magic: Legends, and a clearer picture of how it adapts the rules of Magic: The Gathering. 

Here are our picks of the best card games, best board games, and the best tabletop RPGs right now.

Read original article here

Google will face lawsuit over Incognito mode tracking

Google now has no choice but to deal with a lawsuit over Incognito mode tracking. As Bloomberg reports, Judge Lucy Koh has denied Google’s request to dismiss the class action case. Koh determined that Google “did not notify” users it was still collecting data while Incognito’s privacy mode was active, giving the plaintiffs enough ground to move forward with their case.

The chief participants in the lawsuit had accused Google of misleading users, telling them their info was private even as it monitored their habits. The search giant had argued that users agreed to its privacy policy and thus knew Google was collecting data. It reportedly warned that Incognito “does not mean ‘invisible'” and that sites could still see activity. 

We’ve asked Google for comment.

It’s unclear whether the lawsuit will succeed, let alone whether it will lead to meaningful changes or compensation. Successful class actions frequently lead to payouts that represent a fraction of the damage to customers. Incognito mode’s limitations are well-known among enthusiasts — it’s really there to keep sites out of your local search history and cookies, not to block all potentially identifying traffic.

It’s not clear the general public is aware of Incognito’s true behavior, though. The lawsuit could force Google to more explicitly tell users what it does and doesn’t collect. The complaint also serves as criticism of companies that bury important information in their terms of service. Few people read those agreements from start to finish, and that can cause problems when privacy is at stake.

Read original article here

Judge rules Google has to face lawsuit that claims it tracks users even in Incognito mode

A judge in California ruled Friday that Google has to face a class action lawsuit that claims the search giant secretly collects data from users even when they’re using its private “Incognito” mode, Bloomberg reported.

Three users filed a complaint last June alleging Google has a “pervasive data tracking business,” and its tracking persists even if users take steps to protect their private information, such as using incognito mode in Chrome, or private browsing in Safari and other browsers. The lawsuit seeks at least $5 billion.

Google had sought to have the case thrown out, but US District Judge Lucy Koh wrote in her ruling that the company “did not notify users that Google engages in the alleged data collection while the user is in private browsing mode.”

The company said in a court filing that it makes clear to users “that ‘Incognito’ does not mean ‘invisible,’ and that the user’s activity during that session may be visible to websites they visit, and any third-party analytics or ads services the visited websites use.”

Google spokesperson José Castañeda said in an email to The Verge on Saturday that the company disputes the lawsuit’s claims “and we will defend ourselves vigorously against them.” He added that Chrome’s Incognito mode gives users the choice to browse the internet without activity being saved to their browser or devices. “As we clearly state each time you open a new incognito tab, websites might be able to collect information about your browsing activity during your session.”

Google said earlier this year that it is phasing out third-party tracking cookies, and that it doesn’t plan to replace them with anything similarly invasive, even though that will affect the company’s advertising business.

Update March 13th, 4:39PM ET: Adds statement from Google spokesperson

Read original article here

Chrome OS 89 features: hands-on and walkthrough [VIDEO]

Chrome OS 89 is here and available to most Chromebook users, so we wanted to take a few minutes and walk through all the notable additions to the OS this time around. In the history of Chrome OS versions, I can think of few that launched with so many new tricks and updates. There is so much on offer with Chrome OS 89 that we wanted to spend a few days using all the new features as they appear in the Stable Channel and put together a quick video highlighting each of the new, fun ways you can use your Chromebook.

From screen recording to media controls to the Phone Hub, Chrome OS 89 feels like a culmination of many months of additions, testing, and work to make Chromebooks far more capable and easy to use. Gabriel already posted a rundown of all the new stuff on offer, so if you’d like to read a list and dig a bit into each new feature, head over to his post. For this post, we simply wanted to put all these features to the test and show them to you on video. We know sometimes these announcements happen and really great features are simply forgotten and underutilized. Seeing them on screen, we hope you find a heightened interest and go try each of these new additions for yourself on your own Chromebook.

Finally, I wanted to also point out that as a part of this launch coinciding with the celebration of Chrome OS’ 10th birthday, we’ve partnered with Google to give away 20 limited edition Chromebook sleeves and 20 exclusive sticker packs. We’ve already given away the first 5 and will continue giving away 5 sleeves and sticker packs to 5 lucky winners each week until they are gone. If you’ve not already, be sure to get entered to win via the big button below and good luck!

Read original article here

EA suspends all discretionary FIFA Ultimate Team content granting indefinitely amid “EA Gate” scandal

EA has suspended all discretionary content granting indefinitely amid the ongoing “EA Gate” scandal that has rocked the FIFA series.

Earlier this week Eurogamer reported on how the FIFA community had unearthed direct messages that appeared to show an EA employee selling coveted Ultimate Team cards for thousands of pounds on the black market.

These direct messages mentioned FUT Icon cards in packages priced 750-1000 euros. In one WhatsApp message, three Prime Icon Moments cards were offered for 1700 euros.

Icon cards are among the most sought after in FIFA Ultimate Team. They include legendary players such as Brazilian Ronaldo, Pele, Ronaldinho, Zinedine Zidane and Ruud Gullit, and are near impossible to obtain through the mode’s controversial loot boxes.

Even rarer are Prime Icon Moments – special versions of Icon cards that mark one game or tournament that was special for the players.

EA launched an investigation in response, and overnight provided an update:

“Earlier this week, we were made aware of suspicious activity relating to highly rated content in FIFA Ultimate Team,” EA said.

“We learned that FUT items were granted to individual accounts that did not earn them through gameplay – i.e. by opening a pack, purchasing through the transfer market, completing a reward challenge (e.g. an SBC completion) or other engagement (e.g. viewing a Twitch Broadcast).

“It appears that one or more EA accounts, which were either compromised or being used inappropriately by someone within EA, directly entitled items to these individual accounts.

“The alleged behaviour is unacceptable and in no way do we condone granting or purchasing player items in exchange for money. This practice runs counter to the game’s competitive integrity, is a violation of EA’s User Agreement, and is not something we tolerate. We do not allow the trade or sale of items outside our game for many reasons, including that it would create an unequal playing field for our community.”

Of course, EA itself does grant player items in exchange for money – via loot boxes.

EA insisted its initial investigation has shown questionable activity involving “a very small number of accounts and items”, but despite this, called the alleged activity “unacceptable”.

EA then vowed to take action against any employee found to have been engaging in this activity, to remove any items granted from the FUT ecosystem, and permanently ban any player known to have bought them.

EA then apologised to the FIFA community: “Regardless of these actions, we appreciate how concerning this is to all of our players, and we apologise for the impact of these improper grants within the community.

“We also appreciate how extremely annoying and frustrating it is that this practice might have come from within EA. We’re angry too. We know that the trust of our communities is hard-earned, and is based on principles of Fair Play. This illicit activity shakes that trust. We’ve also been clear since the creation of Ultimate Team that items cannot be exchanged outside our game, and that’s key to how we keep our game safe from manipulation and bad actors. This is a breach of that principle, as well – and we won’t let it stand.”

EA goes on to discuss content granting – something it rarely does. This is when EA gives Ultimate Team content to player accounts. EA said that unless these items are issued to replace lost content, they are usually non-tradeable items, which means they have no exchange value in the game, cannot be sold on the transfer market and cannot be shared with other players. Examples include items used in testing and quality verification, and discretionary content granting to athletes, EA’s partners and employees.

Every now and then the community will discover a famous footballer’s Ultimate Team because they’ve run up against them online, and you’ll sometimes see the footballers use a special 99-rated version of themselves in-game. That’s discretionary content granting to athletes.

“The items granted on a discretionary basis to these partners or employees are always non-tradeable and can be used only by the account to which they were originally granted,” EA insisted. “We do not use this discretionary process to grant content to professional video game influencers.”

Of course, this content can be used in Ultimate Team’s competitive, pay-to-win multiplayer against other players online – a balance issue EA does not address in its statement.

EA said the sum of items granted through these three scenarios (customer experience, testing and partners) combined is less than 0.0006 per cent of the total player items in the FIFA 21 ecosystem. EA insisted these grants have no impact on the odds of any player in the ecosystem acquiring these players, they have no bearing on the overall volume of available content, and all content that is granted is untradeable, with no associated coin value.

“Obviously, the actions being alleged in this case fall far outside of these legitimate scenarios for granting content,” EA said.

EA’s investigation is ongoing, but the company said it has narrowed down how this happened and identified the accounts that have received the content. Meanwhile, EA has suspended all discretionary content granting for an indefinite period.

“Once again, we highly value and appreciate the commitment and support of the FIFA community in helping identify this issue and will continue to provide updates as the investigation progresses towards conclusion,” EA said.

One of the many issues with Ultimate Team the “EA Gate” scandal highlights is the artificial scarcity of these highly-coveted items. Some of the most powerful and most sought after cards in the game – the Prime Icon Moments versions of Ronaldo, Pele, Ronaldinho, Zinedine Zidane and Ruud Gullit, for example – have a below one per cent chance of dropping from a loot box. The exact probability of getting an Icon from a pack is unknown, because EA does not disclose exact percentage chances below one percent – a lack of transparency that has been criticised for some time.

EA is already facing two other lawsuits connected to Ultimate Team, one in the United States alleging the mode breaks California state gambling laws, and one in Canada accusing EA of running “an unlicensed, illegal gaming system through their loot boxes”.

Loot boxes have come under increased scrutiny from government authorities in recent years, too, particularly in relation to their impact on young people. In January 2019, EA stopped selling FIFA Points in Belgium following government pressure over loot boxes. The Netherlands Gambling Authority has also declared loot boxes illegal because they are considered a game of chance, and therefore violate the country’s Gambling Act. The Dutch authorities ended up issuing EA with a fine of up to €10m over loot boxes in FIFA.

FIFA loot boxes are currently not considered a form of gambling in the UK, although the government is taking a close look at them in that context. In July, the House of Lords gambling committee urged the government to “act immediately” to regulate them. The Department for Digital, Culture, Media and Sport launched a consultation on loot boxes in September and a review of the Gambling Act 2005 in December last year. It is due to publish a white paper before the end of this year.

FIFA hit the mainstream headlines last weekend with a Sunday Times investigation titled “FIFA’s ugly game lures teens to gamble”.

EA has called FIFA loot boxes “surprise mechanics”, and in a statement given to the Sunday Times, likened them to Kinder eggs. A spokeswoman told the paper there was “no advantage to purchasing Ultimate Team packs rather than earning them”, and that the majority of player packs were awarded through in-game accomplishments. She said users could track or limit their spend through FIFA Playtime, a new in-game tool, and said access to online gameplay could be restricted using parental controls on consoles.

Read original article here

Adobe Photoshop’s ‘Super Resolution’ Made My Jaw Hit the Floor

Adobe just dropped its latest software updates via the Creative Cloud and among those updates is a new feature in Adobe Camera Raw (ACR) called “Super Resolution.” You can mark this day down as a major shift in the photo industry.

I have seen a bit of reporting out there on this topic from the likes of PetaPixel and Fstoppers, but other than that the ramifications of this new feature in ACR have not been widely discussed from what I can see. The new Super Resolution feature in ACR essentially quadruples the pixel count of an image (doubling both its width and height) using machine learning, i.e. Artificial Intelligence (AI).

The PetaPixel article on this new feature quoted Eric Chan from Adobe:

“Super Resolution builds on a technology Adobe launched two years ago called Enhance Details, which uses machine learning to interpolate RAW files with a high degree of fidelity, which resulted in images with crisp details and fewer artifacts. The term ‘Super Resolution’ refers to the process of improving the quality of a photo by boosting its apparent resolution,” Chan explains. “Enlarging a photo often produces blurry details, but Super Resolution has an ace up its sleeve: an advanced machine learning model trained on millions of photos. Backed by this vast training set, Super Resolution can intelligently enlarge photos while maintaining clean edges and preserving important details.”

What does this mean practically? Well, I immediately tested this out and was pretty shocked by the results. Though it might be hard to make out in the screenshots below, I took the surfing image shown there, which was captured a decade ago with a Nikon D700 — a 12MP camera — and ran the Super Resolution tool on it; the end result is a 48.2MP image that looks every bit as sharp as (if not sharper than) the original image file. This means that I can now print that old 12MP image at significantly larger sizes than I ever could before.
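To make the enlargement arithmetic concrete, here is a minimal sketch in plain Python (my own illustration, not Adobe's tooling or API) that doubles each dimension and reports the resulting megapixel counts; plugging in the D700's 4256×2832 frame gives the roughly 48MP figure quoted above.

```python
def super_resolution_dims(width, height, scale=2):
    """Return the enlarged dimensions and megapixel counts for a
    Super Resolution-style enlargement that doubles each side
    (quadrupling the total pixel count)."""
    new_w, new_h = width * scale, height * scale
    return {
        "original_mp": width * height / 1e6,
        "enhanced_size": (new_w, new_h),
        "enhanced_mp": new_w * new_h / 1e6,
    }

# Nikon D700 example from the article: 4256x2832 -> 8512x5664 (~48.2MP)
print(super_resolution_dims(4256, 2832))
```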

What this also means is that anyone with a lower resolution camera, i.e. the current crop of 24MP cameras, can now output huge image files for prints or any other usage that requires a higher resolution image file. In the three or four images I have run through this new feature in Photoshop I have found the results to be astoundingly good.

Let’s run through how this works. First off, it works with any image file, whether it is a raw image file, a TIFF, or a JPEG. You will have to open the image file in Adobe Camera Raw via Photoshop or Adobe Bridge as shown below. To access the Super Resolution feature, right-click on the image and choose “Enhance” as shown below.

A dialog window will come up so you can see how the image will look and you can also toggle back and forth between the original image and the new Enhanced version. The dialog will give you an estimate on how long it will take to create the new Enhanced image, which will show up as a separate image file. Once you are ready simply click the Enhance button in the lower right-hand corner. ACR starts working in the background immediately to build the new image file and it eventually appears right next to the original file you selected wherever that one is stored.

In my testing, as shown below, it took this old 12MP image from 4256×2832 pixels to 8512×5664 pixels. The screenshots below show this enlargement. The top image is the lower resolution (original) version and the bottom image is the one that went through the Super Resolution process. The higher-res image looks absolutely amazing. And at 48MP I could easily blow this up to a 40×60 inch print just as with any image captured using my 45MP Nikon D850.

The original image at 4256×2832 pixels shown at 100% in Adobe Photoshop.
The new Enhanced image upsized using the Super Resolution feature at 8512×5664 pixels shown at 100% in Adobe Photoshop.

Once I upsized the image using the Super Resolution feature, I zoomed into the resulting image and was very impressed. The image seemed just as sharp as (if not a little sharper than) the original image file, but of course it is massively larger (in terms of resolution and file size). Kudos to the folks at Adobe for creating a truly revolutionary addition to Photoshop. I have tried some of the Topaz AI software options, like Topaz Gigapixel AI, but I have not seen them work this well.

So what does this mean? For starters, it means that AI technology will have a huge impact on photography. Going forward, the software we use to work up our images (and upres them) might in some instances have a larger effect on the final images than the camera that was used to capture the image.

To a certain degree, this new tool in Photoshop significantly equalizes the playing field no matter what camera you are working with. All of a sudden my Nikon Z6 and Fujifilm X-Pro3 (respectively 24MP and 26MP cameras) are capable of producing stunning large prints in a way that was previously just not possible.

What about high-resolution cameras, you may ask? Where do they end up with all of this? The new Super Resolution tool will allow you to upres any image as long as the resulting “Enhanced” image file is less than 65,000 pixels on the long side and under 500MP in total. What that means is I can upres the 102MP images from my Fujifilm GFX 100 and GFX 100S cameras and produce insane 400MP image files from a single image. That is getting into the absurd, but that also opens some doors for crazy huge prints.
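As a rough illustration of those ceilings (again a simple Python sketch of my own, not how ACR actually validates files), the check below tests whether a doubled-per-side output would stay under the 65,000-pixel long edge and 500MP total mentioned above.

```python
LONG_SIDE_LIMIT = 65_000   # max pixels on the long edge of the enhanced file
TOTAL_MP_LIMIT = 500       # max total megapixels of the enhanced file

def can_enhance(width, height, scale=2):
    """Return True if a scale-x-per-side enlargement stays within the
    published Super Resolution output limits."""
    new_w, new_h = width * scale, height * scale
    long_side_ok = max(new_w, new_h) <= LONG_SIDE_LIMIT
    total_ok = (new_w * new_h) / 1e6 <= TOTAL_MP_LIMIT
    return long_side_ok and total_ok

# The ~102MP Fujifilm GFX 100 frame discussed below (11205x8404) enhances
# to roughly 376MP, comfortably inside both limits.
print(can_enhance(11205, 8404))  # True
```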

The reality is that this feature is a huge boon to lower resolution (12MP to 16MP) and even medium resolution (24MP) camera owners. Higher resolution cameras will still yield better image quality but we now have the option of making large prints from relatively low-resolution image files.

Enhancing a Photo to 376 Megapixels

After talking with some photographer friends about this new feature I played around with images from a variety of different cameras to see how the results vary. I ran a few images through from my Nikon Z6 and also a few from my Fujifilm GFX 100. With the GFX 100 image, the Super Resolution feature popped out a 376MP image file that was damn near identical to the original image file, just four times larger. My jaw hit the floor when I zoomed into 100% and compared it to the original! You can see both the original and the Enhanced images below. There is no way to actually convey the 100% image size here as I have no control over the viewer’s screen resolution, but regardless, they both look wicked sharp.

The original Fujifilm GFX 100 image at 11205×8404 pixels shown at 100% in Adobe Photoshop.
The new Enhanced image upsized using the Super Resolution feature at 22409×16807 pixels (376MP) shown at 100% in Adobe Photoshop.

From what I can tell, the Super Resolution tool seems to do an even better job with higher resolution cameras and in particular with cameras that do not have an anti-aliasing filter in front of the sensor. My Nikon Z6 images when enhanced with this tool still look impressive but not as jaw-dropping as the example above. The Z6 has a very strong anti-aliasing filter, basically a filter that slightly blurs the image to reduce digital artifacts. In addition, it seems like the amount of sharpening or noise reduction applied to the image is also magnified so playing around with how the image is worked up may have a significant effect on the final image quality. I will have to do some more testing.

If you have gotten this far, and are still reading this full-on pixel-peeping madness, then you might have realized that this could be the best upgrade to any and every camera ever. This is certainly one of the most incredible features Adobe has ever released in Photoshop.

This is just the start of the AI revolution. It also shows quite clearly that many of the advancements in image quality are going to come from the software side of the equation as we start to see cameras with incredible specs that might be hard to dramatically improve upon in the coming years. I am super excited about this new option in Photoshop as it will allow me to offer much larger prints than I have been able to create previously–and they will look stunning.


About the author: Michael Clark is an internationally published outdoor photographer specializing in adventure sports, travel, and landscape photography. The opinions expressed in this article are solely those of the author. Clark contributes to National Geographic, National Geographic Adventure, Sports Illustrated, Outside, Men’s Journal, Backpacker, Outdoor Photographer, Digital Photo Pro, Climbing, Alpinist, Rock and Ice, Bike Magazine and The New York Times among many others. You can find more of Clark’s work on his website, Facebook, Twitter, and Instagram. This article was also published here.



Read original article here

Nvidia brings its latency-reducing tech Reflex to Overwatch

An update to Overwatch’s Public Test Region (PTR) is bringing Nvidia’s latency-reduction tech, called Reflex, to the popular esports title (via Engadget). The tech aims to reduce the amount of time between when you click your mouse and when you see the resulting action on screen, making the game feel more responsive. The fact that it’s coming to Overwatch was announced back in January, but it’s now available to players who can access the PTR and who have the latest Nvidia drivers.

If you haven’t been able to get your hands on one of Nvidia’s latest graphics cards, there’s still hope that you’ll be able to try Reflex out for yourself in Overwatch — the tech was announced alongside the 30-series graphics cards, but works on cards going back to the GTX 900-series.

Nvidia has an incredibly in-depth explainer on how the tech works, but the very surface-level overview is that the game will work with your GPU to make sure that frames are made “just-in-time” to be shown on your monitor, so you should theoretically always be seeing the latest information.
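As a rough mental model only (a hypothetical toy loop, not Nvidia's actual implementation or API), "just-in-time" pacing means the CPU avoids queuing frames ahead of the GPU and samples input as late as possible before each frame must be ready, something like this:

```python
import time

def jit_frame_loop(sample_input, simulate, submit_to_gpu,
                   display_interval=1 / 60, frames=300):
    """Toy illustration of 'just-in-time' frame pacing: rather than letting
    the CPU queue frames ahead of the GPU (which adds latency between a
    click and what appears on screen), wait until just before the next
    display deadline, then sample input, simulate, and submit the frame."""
    work_estimate = display_interval / 2  # rough guess, refined each frame
    for _ in range(frames):
        deadline = time.perf_counter() + display_interval
        # Sleep until the latest moment the frame can still be ready on time.
        slack = deadline - time.perf_counter() - work_estimate
        if slack > 0:
            time.sleep(slack)
        start = time.perf_counter()
        frame = simulate(sample_input())  # input is sampled as late as possible
        submit_to_gpu(frame)
        # Exponential moving average of how long a frame actually takes.
        work_estimate = 0.9 * work_estimate + 0.1 * (time.perf_counter() - start)
```

The callbacks here stand in for whatever a game engine actually does; the point is only that input sampling is delayed until the last moment the frame can still hit its display deadline, which is the general idea behind reducing click-to-photon latency.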

It’s worth noting that latency can have a few meanings, especially when it comes to online games. Reflex isn’t designed to help improve your network latency, so if you’ve got a bad internet connection it probably won’t help improve your gaming experience all that much.

Whether the difference in latency will be noticeable will depend a lot on the type of equipment you’re using, how much better it is, and how eagle-eyed you are. Still, if you’re one of the testers, it’s probably worth turning it on to try it out, and seeing if you notice the improvement. For everyone else, it’s something to look forward to trying out in a future update.

AMD also has a feature meant to reduce input latency on its graphics cards, called Radeon Anti-Lag, which can also be turned on for Overwatch.

Read original article here

You Can Now Play Zelda: Breath Of The Wild In First-Person Thanks To This Newly Discovered Glitch

Sure, we might be eagerly waiting for Nintendo to give us the next proper update about the sequel to The Legend of Zelda: Breath of the Wild, but there’s still plenty to discover in the original 2017 Wii U and Switch release.

Following on from a glitch that makes Link invincible and allows him to walk underwater, a “simple glitch” has now been uncovered – allowing you to play the open-world entry from a first-person perspective. The discovery was made by Zelda fan and Twitter user Axk_000, and it transforms the title into something that looks more like a scene out of The Elder Scrolls V: Skyrim.

Activating this glitch isn’t all that complicated and no modifications to the game or system are required. As you can see above, all you need to do is pull out the in-game camera while holding an item and then cancel it. After this, your camera should be set to first-person. You’ll even be able to see Link’s shadow up close and walk about.

If you would like to try out this glitch yourself, we suggest you do it sooner rather than later before Nintendo potentially patches it out in a future game update. Is this the Breath of the Wild glitch you’ve been waiting for? Would you be up for a Zelda game from a first-person perspective? Share your thoughts down below.



Read original article here

Google Lens is pretty powerful, but how do you use it?

Google Lens is what you get when you feed the giant mountains of data from other Google services into a camera app. 

Point your phone at a product and Google Lens will find out what it is, and how much it costs online. Show it a landmark and Lens will identify it using the vast Google Images library and tie it to handy info like opening hours. 

If there’s text, Google Lens will use optical character recognition to identify it, and let you use it as a search term without typing it in yourself.

It’s been around since 2017, but rather than get quietly sidelined like so many other Google projects, Google Lens has slowly built up increasingly impressive powers – many of which aren’t that well known or understood.

So we’ve rounded up some of our favorite Google Lens tricks here to show you how it can save you time and bother, or instantly search for things that you come across in the real world. But first, here’s how to find it on Android and iPhone.

How to get Google Lens on Android

If you have a recent Android phone, a Lens mode may well already be built into the camera app. 

Look for the Google Lens icon (below), which is a couple of circles hemmed in by three sides of a square.

Nothing there? Just download the Google Lens app from the Play Store.

Using the Google Lens app is much like using a camera app. There’s a shutter button at one end, marked with a search icon, because you don’t actually end up taking photos with Lens. 

Instead, the phone effectively freeze-frames the view, giving Google Lens a scene to analyze so you don’t have to keep the camera pointed at the right spot while Lens does its thing.

How to get Google Lens on iPhone

The Google Lens experience is slightly different for iPhone users. Rather than having a standalone app or being integrated into the camera, Lens is instead built into the official Google app – which you can download for free from the App Store.

From here, it’s simply a case of tapping that Google Lens icon in the search bar (to the left of the microphone), which you can see circled in red below.

Tap the icon circled in red to open Google Lens within the ‘Google’ app on iPhones. (Image credit: Future)

This will open up the camera viewfinder, which gives you access to translate, shopping, text search and food search options – essentially, almost anything you point it at, Lens will be able to search for.

There is another way to use Google Lens on iPhone – if you open a photo in the Google Photos app, you’ll see the Lens icon on the bottom row of icons (second from the right). This lets you search for info on the objects or subjects in your Photos library – for example, a particularly tasty bottle of wine you logged for future reference.

Using Google Lens: the basics

When you hit the search button in Google Lens (on Android or iPhone), you’ll see blue dots in the image highlighting points of interest, and any recognized text will be covered by a translucent block of white. 

Tap on these and Google Lens will bring up the relevant results. The app also lets you pick a category to pare down the kind of results you’ll see.

These vary between platforms, but on Android you’ll see options like Dining, Places, Shopping, Homework, Search, Text and Translate. It’s the same on iPhone, only without the Homework option, which is a highly impressive shortcut to solving maths questions.

Here are 11 of the top things you can actually do with Google Lens right now.

The 11 best ways to use Google Lens

1. Scan barcodes

Google Lens has no problem with barcodes. After all, it’s effectively just a way to encode a number that identifies a product. 

It’s a dead easy way to look up stuff online without photographing the product’s front or its name. That works sometimes, but there’s a consistency to the barcode approach we kind of like.

To scan a barcode, just point the Google Lens camera at a barcode (on iPhone, it’s best to choose the ‘Shopping’ tab at the bottom), wait for it to mark the code with a blue dot and, if needed, hit the shutter. It’ll then bring up the product name and link you to some online stores to buy it from. Handy.

2. Check restaurant reviews as you walk

Google Lens takes your location into account when harvesting results. Flick to the Places tab in the app, hold it up in front of a restaurant or shop front and the app will, in most cases, bring up its Google profile.

From here you can see how it’s rated according to Google’s reviews, and you can get a link to the place’s website – if it’s a restaurant you’ll likely get to see the menu, too.

Of course, the building doesn’t have to be a restaurant – you can also use Google Lens as a virtual tour guide, learning about local landmarks and getting spoon-fed handy information on their opening hours, historical facts and more.

3. Learn about that restaurant dish

Ever find yourself wondering what something on a menu actually is, but can’t quite manage to get the attention of a busy waiter or waitress? Google Lens is perfect for this.

Just tap on a scanned menu and Lens will show you a description of the dish, and even recipes for it. This is handy if you want to know what likely goes into the restaurant’s own version, or if you fancy learning how to make it yourself at home.

Even better for the indecisive, Lens will also sometimes highlight the restaurant’s most popular dishes for you, allowing you to dig down further into reviews and real-world photos taken from Google Maps.

4. Check prices

We love a bargain, and Google Lens offers a great way to check if a shop’s sale prices are actually a good deal. Take an image of a product in the app’s Shopping tab and online shopping deals will be prioritized in the search results.

Try taking a shot of the item itself if that’s the sort of image used on the product page and it has unique identifiers, such as its name. Otherwise you might want to try shooting the box or its label. 

Google Lens may be smart, but it’s unlikely to be able to recognise, for example, a particular pair of jeans. In that case, scanning the tag would be the better option for more accurate results.

5. Search photos you’ve already taken

You don’t have to use the built-in camera with Google Lens – images from your gallery work just fine, too, and that means it works for something someone sent you over, say, WhatsApp.

On both Android and iPhone, press the little mountain icon by the shutter button to open up your photo gallery in the Google Lens app.

This is super-useful for images with text in them, as Google Lens will attempt to detect all text in the picture, which can then be searched and translated into different languages. Handy if you’re abroad and trying to pick up the language on the fly – or just simply finding out the name of a place you visited.

6. Live translate text

Translation is one of the most useful features of Google Lens. More than 100 languages are supported, because it feeds into Google’s longstanding Translate service.

Translating text it recognizes is neat, but Google Lens goes further. The translated text is mapped onto the image in augmented reality fashion. This is particularly handy for menus and signs, we find. But you can also use it for things like train tickets, like the below.

Of course, restaurant menus aren’t where this feature’s use begins and ends. We’ve all felt a little lost or overwhelmed when in a foreign country. In this case, Google Lens can help you out in a pinch, say, if you need directions, or to figure out the specialty of a store you’re interested in checking out.

Like this feature? Give the standalone Google Translate app a try too. It performs translations in real time, again in augmented reality, handy if you’re away from home and can’t read the local language.

7. Get help with your maths homework

The homework tab of Google Lens (which is currently Android-only) sounds like a way to cheat on your math homework, but it’s actually a lot smarter than that – and there’s more educational insight on offer here than you might think.

Sure, if you take a snap of a simple math calculation the links shown will include Google’s calculator and the solution, if it applies. But Google also offers ‘Key Concept’ information for algebraic equations, which tells you the basics of what’s going on in the problem posed.

Maths is a tricky subject to learn at the best of times, so with Google Lens being able to offer insight into how more advanced mathematical concepts work, we can imagine it being a handy helper for revision and preparing for exams.

8. Read out articles

Google Lens also makes great use of Google’s voice synthesis software. Using the Text tab, you can scan an article, a postcard or the back of a cereal packet, for example, and get Lens to read it out.

A Listen button will appear in the results whenever you scan text. We find this to be a great accessibility option, and it could be a huge boon for users with dyslexia, or even those looking to learn pronunciations of foreign words.

That’s right, the Listen feature also works with non-English text. We tried highlighting both French and German text from images we’d taken, and in both cases, the phrases were audibly pronounced with appropriate local voices.

9. Copy text or notes to your laptop

One feature of Google Lens is a pretty obvious application of the tech. You can copy scanned text to your phone’s clipboard, because of course you can – but the app goes one step further than that.

Google Lens also lets you copy that text to your PC or laptop. You just need to have the Chrome browser installed, and be logged into the same Google account you’re using on your phone. There’s a ‘Copy to Computer’ button for this feature, and it puts the text into your laptop/desktop’s clipboard.

This can be a handy shortcut if, for example, you find a section of text on your phone you’d like to refer to later, perhaps for an essay or research. That text can then be transferred over to your PC or laptop for future use.

Impressively, Google Lens can now also copy handwritten notes from your phone to your computer, as long as your handwriting is relatively neat. Just point the Lens camera at the notes, highlight the text and hit ‘copy’ – you should then be able to go to a doc in your Chrome browser and paste the text.

10. Learn about works of art

While the Google app can be used to identify songs, Google Lens is particularly good for identifying visual works like paintings and digital artwork.

This is a pretty simple, but handy, application of Google’s Image Search. You can just use the default ‘search’ tab for this one on Android and iPhone.

From there, you can search for similar images, the same image at different sizes, study the origins of the picture and, if it’s a digital piece of artwork, discover who drew it and find links to their websites and social media pages.

11. Identify plants and animals

The same Google Image search smarts can also be used to identify dogs, cats and types of plant. Once again it feeds into content Google already has in place. 

For example, when you search for ‘Jack Russell Terrier’ on Google, there’s a ready-made profile of the dog breed. It includes details like their life expectancy, average height and weight, and the common personality traits of the breed. By recognizing a kind of dog, cat or plant in an image, Lens can simply pull this stuff up instantly.

We also used Google Lens to identify a eucalyptus plant, and were able to learn all about it when the picture we snapped took us to the relevant Google search results. Doing research on plants that might look nice around the house? This Google Lens feature is your best bet, with surprisingly accurate results.

Read original article here