Thanks, Ethan W., for bringing this video up.
Something strange, scary and sublime is happening to cameras, and it’s going to complicate everything you knew about pictures. Cameras are getting brains.
Until the past few years, just about all cameras — whether smartphones or point-and-shoots or CCTV surveillance — were like eyes disconnected from any intelligence.
They captured anything you put in front of them, but they didn’t understand a whit about what they were seeing. Even basic facts about the world eluded them. It’s crazy, for instance, that in 2018, your smartphone doesn’t automatically detect when you’ve taken naked pictures of yourself and offer to house them under an extra-special layer of security.
But all this is changing. There’s a new generation of cameras that understand what they see. They’re eyes connected to brains, machines that no longer just see what you put in front of them, but can act on it — creating intriguing and sometimes eerie possibilities.
At first, these cameras will promise to let us take better pictures, to capture moments that might not have been possible with every dumb camera that came before. That’s the pitch Google is making with Clips, a new camera that went on sale on Tuesday. It uses so-called machine learning to automatically take snapshots of people, pets and other things it finds interesting.
I had my IB students sign up for Hacktoberfest, which is open to everyone in the global community!
The learning target was to learn how to participate in the global open source software development community.
- Seen here: the first student to earn a shirt, awarded for making four pull requests between October 1–31 in any timezone. Pull requests can be made to any public repo on GitHub; pull requests reported by maintainers as spam, or that are automated, will be marked invalid and won't count toward the shirt.
A powerful statement about the kind of learner who can be successful in software engineering!
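For students (or readers) new to the mechanics, the branch-and-push half of the fork/pull-request workflow can be sketched locally. This is a rough simulation, not the real thing: a local bare repository stands in for a GitHub fork, and all repo names, file names, and identities below are placeholders. The actual fork and "open pull request" steps happen in the GitHub web UI.

```shell
# Simulated fork-and-pull-request workflow.
# A local bare repo ("fork.git") stands in for a GitHub fork.
set -e
tmp=$(mktemp -d)
cd "$tmp"

# 1. "Fork" — on GitHub this is the Fork button; here a bare repo stands in.
git init --bare fork.git

# 2. Clone your fork to get a working copy.
git clone fork.git work
cd work
git config user.email "student@example.com"   # placeholder identity
git config user.name "Student"

# 3. Create a topic branch for the change.
git checkout -b fix-typo
echo "Hacktoberfest contribution" > CONTRIBUTION.md
git add CONTRIBUTION.md
git commit -m "Add contribution note"

# 4. Push the branch back to the fork; on GitHub you would then click
#    "Compare & pull request" to open the PR against the upstream repo.
git push origin fix-typo
git log --oneline -1
```

After the push, GitHub shows a banner offering to open a pull request from the new branch; the four merged-or-valid pull requests during October are what count toward the shirt.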