AI Personality Cloning

[Image: AI personality clone face glitch art]

New research shows that AI can now mimic a person’s personality with surprising accuracy after just a two-hour interview. This breakthrough could reshape how we study human behavior but also raises concerns about misuse, such as creating deepfakes. Could the future of creativity involve digital replicas, not only assisting us but making decisions alongside us?

 

Simulation agents are designed to accurately simulate human behavior across a range of social, political, and informational contexts. In a study conducted by researchers from Stanford and Google DeepMind, 1,052 participants were interviewed for two hours each, on topics ranging from their personal lives to their views on contemporary issues. The recordings of their responses were then used to train a custom AI model for each person. To test how faithfully these agents behaved, both the participants and their digital counterparts completed a series of tasks, including personality tests and games. When participants repeated their answers two weeks later, the AI agents matched their responses with 85% accuracy. A similar level of precision was achieved when the agents predicted behavior across five separate social science experiments.

For social science, digital human replicas could open new possibilities: conducting large-scale studies, or exploring sensitive topics and scenarios that would be ethically problematic to study with real human participants. Joon Sung Park, the Stanford PhD student who led the research, takes it a step further. He believes that these simulation agents could one day allow individuals to have “a bunch of small ‘yous’ running around and actually making the decisions that you would have made.”

 

Opportunities and risks

While it remains a speculative scenario, creating digital copies and AI assistants based on ourselves and our personalities is a prospect that’s rapidly moved from science fiction to scientific possibility. This shift could extend beyond mere assistance and optimization, enabling a new paradigm for intelligent agents. These digital twins can help us analyze our communication styles, simulate different scenarios, and even spark creativity. However, there’s a darker side to this technology. Malicious actors could use these replicas to impersonate us, potentially leading to identity theft and fraud. As our digital footprints grow, so does the risk of our identities being compromised. To protect ourselves, it’s crucial to be mindful of our data privacy. By understanding how our data is collected, stored, and used, we can take steps to minimize the risks.

 

Can I create simpler digital replicas myself?

LLMs like ChatGPT and Google’s LaMDA can generate text in a wide variety of styles. It’s possible to fine-tune these models on a dataset of your own writing or speech to create a custom AI model. One such approach is to build a dataset from, for example, your social media posts, emails, or notes, and use it to train a language model that generates new content mirroring your tone of voice. You can also experiment with video tools that simulate the experience of a real video conversation using training data of a specific person, and tools like Eleven Labs let you clone your voice and integrate it with other services.
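As a minimal sketch of the dataset-building step described above: fine-tuning services that accept chat-style data typically expect one JSON object per line (JSONL), each containing a list of messages. The snippet below converts a handful of your own snippets into that shape. The sample texts, the persona wording, and the output filename are all hypothetical placeholders; in practice you would export real posts, emails, or notes and check your chosen provider’s documentation for the exact schema it requires.

```python
import json

# Hypothetical sample of your own writing (replace with exported
# social media posts, emails, or notes).
my_texts = [
    "Just wrapped up a late-night sketching session. Ideas everywhere!",
    "Honestly, the best creative tools are the ones that stay out of your way.",
]

def build_examples(texts, persona="me"):
    """Wrap raw text snippets in a chat-format structure commonly
    used for fine-tuning: each example pairs a generic prompt with
    one of your snippets as the target assistant reply."""
    examples = []
    for text in texts:
        examples.append({
            "messages": [
                {"role": "system",
                 "content": f"Write in the voice of {persona}."},
                {"role": "user",
                 "content": "Write a short post in your usual style."},
                {"role": "assistant", "content": text},
            ]
        })
    return examples

# Serialize one JSON object per line (JSONL), the file format
# many fine-tuning endpoints expect for upload.
with open("my_voice.jsonl", "w") as f:
    for example in build_examples(my_texts):
        f.write(json.dumps(example) + "\n")
```

Even a few hundred examples in this shape can be enough to nudge a model toward your phrasing, though more (and more varied) data generally gives a more convincing imitation.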


Further reading: Future Proof Your Creativity