Please be assured that we uphold the highest standards of privacy and confidentiality; under no circumstances will your personal information be shared with third parties or used for unsolicited communications.
We are in the early stages of developing our new website. Please be patient with us! Sign up for updates
Right AI Laws, to Right Our Future
Join the Global Grassroots United Front
#WeStakeOutAI #HaltDeepfakes
Unchecked advanced AI threatens our jobs, our safety, even our existence as human beings.
82% of U.S. adults agree: it's time to slow down and deliberate. Learn more about our Global Grassroots United Front, advocating for the right artificial intelligence laws to make AI safer.
What is AI Deepfake Technology?
Top Deepfake Examples, Detection Tactics & Insights to Avoid Getting Tricked or Scammed
Date: Thu, Feb 22, 2024
Time: 6:00 pm EST
Congressman Ted Lieu, who represents California's 36th district and was recognized as one of TIME's 100 Most Influential People in AI, wrote: "I'm a Congressman who codes, and AI freaks me out."
Date: Thu, Feb 29, 2024
Time: 3:00 pm EST
A report by Goldman Sachs says AI could replace the equivalent of 300 million full-time jobs. Workers in multiple professions have already lost their income overnight because of AI. That's years of schooling, years of experience in the industry, years of hard work replaced in the blink of an eye.
From 2018 to 2021 alone, when deepfake technology was still mediocre, roughly $36 million was lost to high-profile deepfake attacks alone. These scams have grown in both frequency and size as AI has improved without regulation.
The development of superintelligent AI poses human-extinction risks if such AI is not aligned with human values or if it becomes uncontrollable. This could lead to scenarios where AI systems make decisions detrimental to human survival.
Help us stake out AI: #WeStakeOutAI. Together, the Safer AI Global Grassroots United Front (SAGGUF) can influence governments around the world to pass the right regulations and policies to govern AI, making it safer for humanity.
Presented at the first-ever international AI Safety Summit (held in the UK in 2023), the ‘scorecard’ we researched is listed on page 3 of the Future of Life Institute's proposal.
Commented on the use of copyrighted works to train AI models, the appropriate levels of transparency regarding their use, and the legal status of AI-generated outputs.
Co-hosted a grassroots educational event for SAG-AFTRA members (not an official SAG-AFTRA event) with actor and singer Melissa Medína, titled "How to Prevent AI from Stealing Your Job".
Frankly, our team alone cannot do much. We are small potatoes. We can do our best to present the evidence, but alone against the giants, we don't stand a chance.
That's why YOU are so important.
The only way we can effect change is with your help. History shows that there is power when people band together, and we strive to be the grassroots support that good policymakers and politicians can rely on to effect change. Your contribution, whether big or small, will help us mobilize and inform others about the crucial AI risks that are harming humanity now and will continue to do so in the future.
United as one, we can ensure AI development is safer for us, for our children, for our children's children, and for humanity as a whole.
CONTENT
Start Here
THE Crucial AI Risk
AI Risk Categories
Other AI Topics
Are You a...?
FAQ
TAKE ACTION
[S] Sign the AI Safety Petitions for Safe AI Laws
[T] Tell Your Story
[A] Advising Tailored to You
[K] Kudos & Donor Recognition
[E] Endorse & Testify
[O] Offer & Volunteer
[U] Unleash Your Influence
[T] Thankful Community
Please contact us to suggest ideas, improvements, or corrections.
We do our best to provide useful information, but how you use the information is up to you. We don’t take responsibility for any loss that results from the use of information on the site. Please consult our full legal disclaimer and privacy policy. See also our cookie notice.
© 2023-2024 – all rights reserved. Please contact us if you wish to redistribute, translate, or adapt this work.