As you may have noticed from a few images that I posted in different threads, I have been playing around with an AI text-to-photo program. I asked it to depict Joe Biden molesting a 12-year-old girl, expecting something along the lines of what we have seen in actual photos over the past years, and most of them were like that - AI images of Joe Biden groping a fully-clothed child, as he is known to do whenever he gets a chance. After it produces one image, the program will generate additional images from the same text prompt when you click on it, so I was pumping out a few of them. Now and then, an image would remind me of the Philadelphia Experiment, with people embedded into walls, while others depicted Biden with an extra arm or ten fingers on each hand, and a lot of them displayed contorted features. But many looked pretty realistic, and some were as realistic as an actual photo - and this was from a free AI application. Then, as I was clicking through, it came out with one in which the 12-year-old girl wasn't clothed, and, before abandoning that project, I became aware that the AI program would produce realistic pornography if you wanted it to. I think I'll be a little more careful about what I tell it to do because I have no interest in going there, but that got me wondering about the implications that such artificial intelligence technology is likely to have on the enforcement of laws against child pornography, and what that could mean. This isn't a new concern, except that it's now in the hands of anyone with a computer. Way back in 2002, the US Supreme Court ruled that virtual child pornography was not illegal because no actual children were involved. Back then, of course, very few people had the resources to create virtual child pornography. Now, it's a free download from the Apple App Store. What are the implications?
On the positive side, pornographers are less likely to risk the possible jail time that comes with using actual children to create pornography, given that an AI program can create unlimited versions of whatever they are looking for, at any age - even babies. This could mean that fewer actual children will be victimized as subjects for the child pornography market. Then again, why would you pay a pornographer for something you can create for yourself through an AI program? Sadly, child pornographers aren't going to give up their trade and find an honest job. Instead, they will transition into whatever people can't generate for themselves through an AI program. Even worse, there will probably be greater demand for that, since perverts are likely to soon grow bored with the stuff they can generate through an AI program and go looking for something more. Many of them do anyhow, but I think the numbers will be greater, since people are now able to access child pornography without having to visit illegal websites on the dark web or do business with people even shadier than themselves. How many people will be introduced to this material through AI imagery who otherwise never would have been? Whatever an individual's inclinations, a lot of people just aren't brave enough to access the dark web, or technologically savvy enough to figure out how. But you don't have to be either brave or technologically savvy to tell an AI program what you want it to display on your computer screen. What is this likely to mean for the future?
OMG Ken, you better save this post in case you ever get arrested or something. And our future, whatever is left of it - huge problems, I fear. Lucky me, I will be dust in the wind and long gone. Think I would have picked a different project, other than anything to do with porno stuff.
The photos the AI displays are determined by the words that are typed into the box, and none of those words involved pornography or naked people. Typing the same words into a search engine wouldn't turn up nude photos. That was my point, or at least part of it. Perhaps I didn't explain it well enough, or it wasn't read carefully. I wasn't looking for this stuff; 90% of what the AI gave me was along the lines of what I was looking for, which was to see how easy it would be for someone to manufacture scandalous (but not pornographic) photos of someone through AI. For Biden, I guess that was too easy because there are enough real photos of him fondling little girls - though in none that I know of were any of these girls undressed. Anyone playing around with the AI program is likely to receive stuff that he wasn't looking for sooner or later, and those who are looking for it are going to get more than they ever dreamed. Therein lies the problem. It is a problem, I fear, because it opens this garbage up to people who wouldn't otherwise come across it even if they were looking, thus moving the bar while, at the same time, complicating the enforcement of laws against child pornography.
I do not know enough about the situation to agree or disagree. Slimy as Biden is, IMO, just looking at him, I have not seen any real pictures of him fondling any child. The one or two on the net state that the photos were altered. There was a photo of him trying to kiss or hug a baby the mother was holding, and the baby did not like that. So where are these photos? I tried to find them and could not. Bottom line, it's hard to believe anything, good or bad, in today's world. Everything seems to be altered in favor of getting attention. But yeah, AI will continue to cause issues for us all.
Without pretending to understand AI in any meaningful way, I think that what it did in the renditions it gave me of Biden groping little girls was little more than taking actual photos from the Internet and manipulating them - changing the hair color, features, expressions, and so on - to create a new image. That's why I said it probably didn't have to work too hard to render them. Removing the clothing was a surprise, and I didn't go any further with that. However, having used AI to create other, not-so-pornographic images, I can see that it could be used to portray pretty much anything. Given that it's AI, if you were looking for something in particular, you'd have to scrap several of them, because it sometimes gives people extra arms or fingers, and it might bend a joint or stretch an appendage far beyond the capabilities of an actual human body. I can't remember which it was now, but I remember seeing a photo published in mainstream publications that seemed to show an extra arm on someone. I didn't think of it at the time, but it may have been AI, since the technology has been around for a while, just not so accessible. Considering that this is the free version, I would guess that the professional versions can produce an image that would require special equipment to debunk, and then we'd be left depending on the "experts" to tell us whether it's real or not. If there's an interest, we can discuss the capabilities of AI elsewhere (there may already be a thread), but I'd like to keep this one on the subject of its likely deleterious effects on society and the efficacy of child pornography laws. As mentioned, the Supreme Court decided that images of virtual children were not punishable as child pornography because no actual children were used, but should that really be the deciding factor? Child pornography harms the depicted children foremost, but it also harms those who partake of it.
While we might reasonably have no sympathy for those who partake of child pornography, people who view it regularly, like pedophiles, become addicted to it and are likely to want more, such as a real child. So, in the end, I don't believe that children are left unharmed by it, real or virtual. Might other people, who wouldn't otherwise have been inclined to seek out such images, be drawn into it out of curiosity, due to the increased availability and lack of repercussions that artificial intelligence provides? I fear so. Then again, enforcement becomes difficult either way.
I guess everyone is afraid to discuss this topic, which is probably why there's so much of it going on, given that no one wants to look in that direction. As for the AI program, I deleted it - not for that reason, but because I was spending too much time with it. And again, not for that reason, but because it is really fascinating in what it is capable of doing, as well as in what it does or does not understand, and in the challenge of learning to describe something in a way that artificial intelligence can understand - for example, describing someone from a photo (or someone you know) in words that the AI can render into an image. The AI can easily and effectively duplicate human expressions when the right words are used to describe them. Interesting, but it was taking up far too much of my time. It's not like it has a thing for displaying nude people. The word "molesting" got it thinking that way, I suppose, although that wasn't its first inclination. Otherwise, it will assume that people are clothed unless indicated otherwise. If asked to display a person whose characteristics are described, it will create an image based on what it understands from the description, placing that person in random clothing. Not exactly random, though: if you specify a date or a place, it will usually clothe that person appropriately for the climate and other circumstances. However, if you describe a person's characteristics and then ask it to render that person in a blue shirt, it is apt to believe that a blue shirt is all that this person is supposed to be wearing. So if you want to avoid nudity, it's best to let the AI choose the clothing or specify every item of clothing. And be careful of words that have more than one meaning.
I was going to post something in Religion, but heck with that; if people don't believe, it's none of my business - it's God's business, and He is in control. I've been listening to a pastor who has researched AI and has quite a bit to say on the subject, not least of which is that it's just one sign of the return of Christ. One pastor said "some people want to turn the normal into the abnormal, including the nuclear family." They hate what we Christians believe, obviously. They want to play God, but sadly, some are worshipped as gods, like those leading the way for changing children's sex. No matter how many surgeries or drugs you're given, you cannot change the sex you were created to be. God made them man and woman, period! The AI stuff can be good if it's used as at least part of what Elon Musk seems to be hoping for: little chips (not unlike a pacemaker) that can help those with MS or Parkinson's, for example. Elon Musk and the head of Google - I can't think of his name offhand - were great buddies, until Elon Musk was said to disagree with the Google man. I can't remember it all, so I'll post the video I watched from a pretty good pastor, in my opinion. Take it or leave it, but I can't sign off without saying what was said to me years ago. They told me, "What if you commit your life to believing in Christ?" I knew I'd live a better life, which is what I was searching for, and have something to live for - being in heaven for eternity. Suppose it turned out to be a lie; I'd just be dead and in the dirt, no thoughts, dead brain, no life at all. Then they said, "What if it's all true, and heaven awaits those that believe?" Here's the pastor's video where he discusses mostly AI. Start at 2:00 on the video to skip his subscribing info.