Create Don't Scrape - Tumblr Posts

1 year ago

I'm 80% sure this isn't actually photography and is just AI generated. The blankets on the first bed are unrealistically folded, and the headboard looks to have classic AI scribbling. The second image has 5 rafters meeting in 1 corner.

If all of these are real photos, good on the photographer for making such amazing edits, but it's unlikely.

friendlybageldemon - Friendly_Bagel_Demon

Tags :
1 year ago

The amount of "aesthetic" blogs I have seen on here using AI "photography" without mentioning it. This one's better at hiding it, but the AI is clear to see in the books and the writing. Please learn how to identify AI and support actual photographers.

friendlybageldemon - Friendly_Bagel_Demon

㋡🥀


Tags :
10 months ago

This shit is AI, not photography. Just tag that, please, for the love of God. This shit ain't travel, because these pictures aren't real.

friendlybageldemon - Friendly_Bagel_Demon

Tags :
1 year ago

My experience with Glaze, before and after glazing (Glaze is a protective layer that defends your work against AI scraping and theft)

Here's a drawing before being Glazed™; click read more to see the drawing after being glazed

illustration of a dog stabbed by a knife, the blood dripping is blue

nah, i tricked you: that IS the image after being glazed. if you didn't notice the effect, it may not look apparent on your art either (although do note that my art uses a lot of textures, so the protection layer is not very visible, and i used the default intensity setting).

the social media website for artists, cara (dot) app, lets you glaze your images FOR FREE on their own servers, meaning you don't need a powerful computer for it. i just needed to create an account and i could upload a file right away. it takes a few hours, if that's a dealbreaker for you.
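since the whole point is that the glazed file looks unchanged, one way to sanity-check a glazed copy yourself is to diff the two files' pixels. this is a rough numpy sketch with random stand-in arrays (real Glaze output is a model-targeted perturbation, not random noise, and the ±4 bound here is an assumption for illustration):

```python
import numpy as np

# Stand-ins for an original artwork and its glazed copy. Real Glaze
# computes a model-targeted perturbation; plain random noise here just
# illustrates how small the per-pixel changes can be.
rng = np.random.default_rng(0)
original = rng.integers(0, 256, size=(64, 64, 3)).astype(float)
perturbation = rng.uniform(-4, 4, size=original.shape)  # assumed bound
glazed = np.clip(original + perturbation, 0, 255)

diff = np.abs(glazed - original)
print(f"max per-pixel change: {diff.max():.1f} / 255")
print(f"mean per-pixel change: {diff.mean():.1f} / 255")
```

with your actual before/after files you'd load both images into arrays the same way and compare; a few units out of 255 per channel is roughly why the change is so hard to see.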


Tags :
1 year ago

reminder that all art i've been posting lately has been glazed to protect it against AI theft. if you didn't notice, that means it's working wonders.

you can go to cara (dot) app, an ArtStation alternative site, and send artwork to let them glaze it for you. wait time is generally 1-2 hours, which is barely an inconvenience, and you even receive an email so you don't need to keep checking.

mobile UI of the cara dot app website, there is a box in the middle prompting the user to either glaze an image and post it (labeled "new post") or just glaze without posting (labeled "choose file...")

you can do this on your phone!!!!!!!!!!!!! :OOO


Tags :
1 year ago
homosexualfiend - HomosexualFiend

@staff

OUR CONTENT SHOULD BE OPTED OUT OF AI TRAINING BY DEFAULT!


Tags :
1 year ago

as an artist myself, seeing that list of artists who got their work stolen is so saddening. we don't need ai generators like that. pay real artists for their hard work. ↓read this tweet↓

Midjourney developers caught discussing laundering, and creating a database of Artists (who have been dehumanized to styles) to train Midjourney off of. This has been submitted into evidence for the lawsuit. Prompt engineers, your “skills” are not yours https://t.co/wAhsNjt5Kz pic.twitter.com/EBvySMQC0P

— Jon Lam #CreateDontScrape (@JonLamArt) December 31, 2023

Tags :
1 year ago
A New Tool Lets Artists Add Invisible Changes To The Pixels In Their Art Before They Upload It Online

A new tool lets artists add invisible changes to the pixels in their art before they upload it online so that if it’s scraped into an AI training set, it can cause the resulting model to break in chaotic and unpredictable ways. 

The tool, called Nightshade, is intended as a way to fight back against AI companies that use artists’ work to train their models without the creator’s permission. Using it to “poison” this training data could damage future iterations of image-generating AI models, such as DALL-E, Midjourney, and Stable Diffusion, by rendering some of their outputs useless—dogs become cats, cars become cows, and so forth. MIT Technology Review got an exclusive preview of the research, which has been submitted for peer review at computer security conference Usenix.   
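The "dogs become cats" effect comes from poisoned training samples that are captioned one way but statistically resemble another class. Here is a deliberately tiny toy sketch of that idea in numpy (not Nightshade itself; real diffusion models are far more complex, and the "generator" here is just an average over points captioned with a label):

```python
import numpy as np

# Toy demo of the poisoning idea, NOT Nightshade itself. "Training data"
# is 2-D points, and the "generator" averages everything under a caption.
rng = np.random.default_rng(1)
dogs = rng.normal(loc=[0.0, 0.0], scale=0.3, size=(50, 2))   # real dog images
cats = rng.normal(loc=[3.0, 3.0], scale=0.3, size=(50, 2))   # real cat images

# Poison: cat-looking samples captioned "dog".
poison = rng.normal(loc=[3.0, 3.0], scale=0.3, size=(200, 2))

def generate(train, label):
    # Toy "generator": output the mean of everything captioned `label`.
    return train[label].mean(axis=0)

def nearest_true_class(x):
    # What a generated point actually resembles, judged by the real clusters.
    d_dog = np.linalg.norm(x - dogs.mean(axis=0))
    d_cat = np.linalg.norm(x - cats.mean(axis=0))
    return "dog" if d_dog < d_cat else "cat"

clean = {"dog": dogs, "cat": cats}
poisoned = {"dog": np.vstack([dogs, poison]), "cat": cats}

print(nearest_true_class(generate(clean, "dog")))     # resembles a dog
print(nearest_true_class(generate(poisoned, "dog")))  # resembles a cat
```

The mechanism is the same in spirit: enough poisoned samples drag the model's learned notion of "dog" toward the cat cluster, so a clean prompt produces the wrong thing.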

AI companies such as OpenAI, Meta, Google, and Stability AI are facing a slew of lawsuits from artists who claim that their copyrighted material and personal information was scraped without consent or compensation. Ben Zhao, a professor at the University of Chicago, who led the team that created Nightshade, says the hope is that it will help tip the power balance back from AI companies towards artists, by creating a powerful deterrent against disrespecting artists’ copyright and intellectual property. Meta, Google, Stability AI, and OpenAI did not respond to MIT Technology Review’s request for comment on how they might respond. 

Zhao’s team also developed Glaze, a tool that allows artists to “mask” their own personal style to prevent it from being scraped by AI companies. It works in a similar way to Nightshade: by changing the pixels of images in subtle ways that are invisible to the human eye but manipulate machine-learning models to interpret the image as something different from what it actually shows. 
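The shared mechanism the paragraph describes is an adversarial perturbation: pixel changes bounded to be imperceptible, but pointed in the direction that most confuses a model. A minimal sketch of one such step, using a random linear "style classifier" as a stand-in (the real tools optimize against large vision models, and every name and number below is an illustrative assumption):

```python
import numpy as np

# Toy adversarial-perturbation step, in the spirit of Glaze/Nightshade.
rng = np.random.default_rng(2)
w = rng.normal(size=64)           # weights of a toy linear "style classifier"
image = rng.uniform(0, 1, 64)     # a flattened 8x8 "image", pixels in [0, 1]

def style_score(x):
    # Positive score = "style A", negative = "style B".
    return float(w @ x)

eps = 0.05  # budget: each pixel may change by at most 5% of full scale

# One FGSM-style step: nudge every pixel in the direction that pushes the
# score toward the opposite sign. For a linear model the gradient is just w.
step = eps * np.sign(w) * np.sign(style_score(image))
adv = np.clip(image - step, 0, 1)

print(np.abs(adv - image).max())            # bounded by eps: barely visible
print(style_score(image), style_score(adv)) # score shifts toward the other style
```

One step with a toy model may or may not flip the label outright; the actual tools run a careful optimization against real feature extractors, but the core trade-off (tiny pixel budget, maximal model confusion) is the same.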

Continue reading article here


Tags :