27 July 2022 - A Weekly Publication by New North Ventures, formerly the Dual-Use Round Up
Trust and Digital Authenticity
We live in a world where everything is going digital. The COVID-19 pandemic forced many interactions onto the internet that would otherwise have happened in person, driving companies, individuals, and organizations to host meetings, events, and more over chat and video. While this shift has been great for productivity, it comes at a cost: operating in the digital world means relying on our digital identities to interact with other people's digital identities. We are all living amongst "digital people."
Early digital computers used switches and jumpers to load user data directly into the processor. These gave way to interactive input devices: the keyboard, monitor, and mouse. While these devices made computing more immersive, personal data was still held mainly in offline databases, and the idea of a digital self was limited.
Enter high-definition image capture, high-speed networks, and artificial intelligence (AI), and the internet became populated with digital personas. Users could upload sound, text, and images through sensors, microphones, and cameras, and computers could apply natural language processing and related techniques to that vocal, written, and visual data. We now live in an age of digital interaction where a computer can use a person's facial expressions and voice to imitate tone recognition, emotional understanding, and contextual awareness. Our computers can even memorize personal preferences and data to tailor online interactions to individual users.
This presents a major problem for trust and verification online. When our computers can act on our behalf, it becomes increasingly difficult to distinguish real human interactions from manipulated images and sounds. Online authentication and identity-verification processes exist, such as two-factor procedures and biometric identifiers; they act like border guards of the internet, making sure users present proper papers before entry. While these processes have helped keep people safe online, there is still little transparency once a verified user hands their data to an application that can then act on their behalf through automation. With that opacity, a person may reasonably wonder whether they are interacting with the real John Doe or the digital John.
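As an aside, the "two-factor procedures" mentioned above are typically built on one-time passwords. The sketch below is a minimal, illustrative implementation of the standard HOTP/TOTP schemes (RFC 4226 and RFC 6238), included only to show how the "border guard" check works; it is not any particular vendor's product:

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password for a given counter value."""
    msg = struct.pack(">Q", counter)                     # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                           # dynamic truncation offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

def totp(secret: bytes, period: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based variant: the counter is the current 30-second window."""
    return hotp(secret, int(time.time()) // period, digits)
```

A server and a phone app that share the same secret will compute the same six-digit code for the same time window, which is what lets the "border guard" confirm the person holds the second factor.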
In the 1950s, the British computer scientist Alan Turing proposed the Turing Test, which evaluates a machine's ability to behave indistinguishably from a human. The test has been applied repeatedly in AI research, and our computers appear to be getting ever closer to passing it. The Loebner Prize ran for nearly 30 years, pitting computers against humans in conversation and testing whether a human judge could identify whom they were speaking with. There is even an entire genre, generative art, in which computers create artwork, a seemingly complex task for a machine. Some of the generated pieces are of significant quality, and the results improve every year.
This leaves us with uncomfortable challenges. When we interact with someone over video or text chat, how can we know they are in fact that person? What can we do when verified people can also deploy a digital self? What happens if a system decides we are not real? And as generated experiences become more prevalent, how do we tell the difference between a digital persona, an augmented version of that persona, and something entirely generated?
The recent uptick in deepfakes leaves society uncertain, eroding trust in news sources previously relied on for transparency and accurate information. Innovation in this space is crucial and urgent. Fortunately, we are seeing more operators develop cutting-edge technology to detect deepfakes. Artificial intelligence is a key component of the solution, which is why we invested in Reality Defender, an AI-based deepfake detection platform.
Combining smart algorithms with a large repository of known authentic and manipulated photos provides a durable advantage in surfacing pictures that have been altered. And while being able to authenticate an image is a good first step toward more transparency online, we are still a long way from safety and trust.
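To make the "repository of known photos" idea concrete, here is a minimal sketch of one classic building block: a perceptual "average hash," whose Hamming distance can flag copies that have drifted from a known-authentic original. This is our own illustration of a generic technique, not a description of Reality Defender's method, and the 4x4 "images" are toy data:

```python
def average_hash(pixels):
    """Perceptual 'average hash': one bit per pixel, set when the pixel
    is brighter than the image's mean brightness."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(a, b):
    """Number of differing bits; a small distance suggests the same image."""
    return sum(x != y for x, y in zip(a, b))

# Toy 4x4 grayscale "images": a known-authentic original and an edited copy
# in which one dark pixel (22) has been brightened to 250.
original = [10, 200, 30, 220, 15, 210, 25, 230, 12, 205, 28, 215, 18, 225, 22, 235]
edited   = [10, 200, 30, 220, 15, 210, 25, 230, 12, 205, 28, 215, 18, 225, 250, 235]

dist = hamming(average_hash(original), average_hash(edited))
```

A real system compares the hash of a submitted image against a repository of hashes of known originals; an exact match passes, a small distance suggests a tampered copy, and a large distance means a different image entirely.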
In addition to detection technology like Reality Defender's, legislators and the courts will need to find the right balance between protecting targets of deepfakes and preserving First Amendment rights. Finally, heightening public awareness and education is key, as technology alone won't stop the spread of disinformation.
Jeremy, @NewNorthVC Co-Founder
Key People: Founder & CEO Aatish Mandelecha
Elevator Pitch: Data security platform intended to solve the identity theft problem by going to the source of it
Funding: Raised $3.5M of seed funding on July 19, 2022 in a round led by Fuse
Elevator Pitch: Air mobility platform intended to build an aerial system for future hardware or software integrations
Funding: Joined Techstars as part of the summer class on July 18, 2022 and received $20K in funding
Key People: Founder & CEO Matt George
Elevator Pitch: Autonomous flight technology designed for fixed-wing aircraft
Funding: Raised $105M in Series B funding on July 13, 2022 in a round led by Snowpoint Ventures
Focused on early stage and growth tech companies to bolster national security
Planning to invest 60-70% of its capital into Series A and B rounds and 20-25% into seed and growth stage companies
Raising $250 M to support companies that focus on future mobility, space, sustainability, digital enterprise applications, networks and security
Boeing agreed to invest $50M in AEI Horizon X venture capital fund
DoD allocated $100M to 10 startups in hopes of accelerating existing innovative tech pilots into full production
Trying to bridge the gap from proof-of-concept to scaling the technology
Dark matter refers to variation in software-company valuations that revenue and profitability can't explain
Valuation skews have a "fat tail": most multiples are unremarkable, with a few standouts skewing the data