Date: 05/02/18

Aurga uses AI to run your camera

Not everyone who owns a DSLR is a highly skilled photographer, yet they all want their photos to look good. Well, that's why Aurga was created. It's a plug-in module that uses artificial intelligence to adjust the camera's settings, based on a database of professionally taken photographs.
 
Similar in concept to the successfully crowdfunded Arsenal, Aurga slots into the camera's flash mount and is wirelessly controlled from up to 100 ft (30 m) via an iOS/Android app. Using that app, users can choose between six photo categories, based on the type of shot they're taking – these include the likes of Portrait, Landscape and Moving Object.
 
Once a category has been selected, Aurga searches through its database of thousands of photos, analyzing ones that fall into that same category. Taking the existing ambient lighting into account, it then adjusts the camera's settings, based on what was done in those other photos.
 
If the user is taking a portrait, for example, Aurga will open up the aperture to decrease the depth of field. Should they want light trails in their night-time shots, on the other hand, it will select a slow shutter speed.
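The category-based adjustments described above can be illustrated with a minimal sketch. The category names, f-numbers, shutter speeds and the low-light ISO rule are all hypothetical stand-ins, not Aurga's actual logic:

```python
# Hypothetical category-to-settings mapping, loosely mirroring the examples
# in the text (wide aperture for portraits, slow shutter for light trails).
# All names and values are illustrative, not Aurga's real algorithm.

SETTINGS_BY_CATEGORY = {
    # Wide aperture (low f-number) gives a shallow depth of field.
    "Portrait": {"aperture_f": 1.8, "shutter_s": 1 / 200, "iso": 100},
    # Narrow aperture keeps foreground and background sharp.
    "Landscape": {"aperture_f": 11.0, "shutter_s": 1 / 125, "iso": 100},
    # Fast shutter freezes motion.
    "Moving Object": {"aperture_f": 4.0, "shutter_s": 1 / 1000, "iso": 400},
    # Slow shutter produces light trails at night.
    "Night Trails": {"aperture_f": 8.0, "shutter_s": 10.0, "iso": 100},
}

def suggest_settings(category: str, ambient_ev: float) -> dict:
    """Pick base settings for a category, then raise ISO for dim scenes."""
    settings = dict(SETTINGS_BY_CATEGORY[category])
    if ambient_ev < 8:  # dim light: boost ISO rather than slow the shutter
        settings["iso"] = min(settings["iso"] * 4, 6400)
    return settings

print(suggest_settings("Portrait", ambient_ev=6))
```

A real module would refine these baselines against its photo database and metering, but the shape of the decision – category first, ambient light second – is the same.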
 
The app can also be used for full manual control of the camera, to trigger the shutter remotely, and to get HDR (High Dynamic Range) shots – in the case of the latter, three identically composed shots are taken at different exposures, and then merged into one composite photo.
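The HDR merge described above is a standard exposure-fusion technique: each pixel of the composite is weighted toward whichever frame exposed it best. A minimal NumPy sketch of that generic idea (not Aurga's actual algorithm) might look like this:

```python
import numpy as np

def fuse_exposures(frames):
    """Merge identically composed frames (float arrays in [0, 1]) by
    weighting each pixel toward the frame where it is best exposed
    (closest to mid-gray), a simple form of exposure fusion."""
    stack = np.stack(frames)                                # (n, H, W)
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * 0.2 ** 2))
    weights /= weights.sum(axis=0, keepdims=True)           # normalize per pixel
    return (weights * stack).sum(axis=0)

# Simulate under-, normal- and over-exposed versions of a gradient scene.
scene = np.linspace(0.0, 1.0, 5).reshape(1, 5)
frames = [np.clip(scene * gain, 0, 1) for gain in (0.5, 1.0, 2.0)]
fused = fuse_exposures(frames)
print(fused.round(3))
```

The overexposed frame recovers shadow detail and the underexposed frame recovers highlights; the weighting blends the two without the clipping either single frame would show.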
 
Additionally, the module has a memory card slot for backing up photos, supporting cards of up to 256 GB. It should be noted, however, that Aurga only works with Nikon and Canon cameras. There's a list of compatible models at the link below.
 
Aurga is currently the subject of a Kickstarter campaign, where a pledge of US$89 is required to get one. Assuming it reaches production, the retail price will be $130.





©ictnews.az. All rights reserved.






