I've been considering using Reddit data to pre-seed the content in a successor to Reddit, though I'm unsure how that would hold up legally.
As a side note, I created an alternative Reddit API[1], and Reddit disliked that so much they banned my 13-year-old Reddit account.
Their eventual replacement will have to employ similar shenanigans, right? It's basically tradition among social media unicorns.
Except now, of course, one can build MUCH more interesting tools for pre-seeding communities. Should be fun!
They are doing the same now... Reddit will end the same way it began.
I dunno, as someone generally disillusioned by tech and its shady practices/ethics, this doesn't seem all that bad to me. It's a little gross but maybe a necessary evil in this day and age.
> Huffman explains the strategy in a video for Udacity, an online education service.
The Udacity web development course was my introduction to coding. It blew my mind. He was a good teacher.
So does every social network.
It's a solid technique. I did the exact same thing for a news site I started, seeding comments on the articles and in the forum.
No one wants to hang around a 'dead' site, but post enough activity and enough 'controversial' comments, and people will want to join in; suddenly you have an actual ecosystem of real people driving traffic.
I don't see a problem at all. You need to bootstrap a UGC platform somehow.
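For the curious, the mechanics are pretty trivial. Here's a toy sketch of such a seeding loop in Python; the endpoint, house accounts, and canned comments are all made up for illustration, not anyone's actual setup:

    # Toy sketch of UGC seeding: rotate through house accounts and post
    # prepared comments so a brand-new site doesn't look dead.
    # The endpoint, accounts, and comment texts are all hypothetical.
    import itertools
    import random
    import time

    import requests

    HOUSE_ACCOUNTS = ["alice_dev", "news_junkie", "skeptic99"]  # sock puppets
    SEED_COMMENTS = [
        "Great point, though the article glosses over the costs.",
        "Hard disagree. Source?",
        "Saw this coming a mile away.",
    ]

    def seed_article(article_id, n_comments=3):
        # Shuffle the account pool, then cycle through it so no single
        # puppet posts twice in a row.
        accounts = itertools.cycle(random.sample(HOUSE_ACCOUNTS, len(HOUSE_ACCOUNTS)))
        for comment in random.sample(SEED_COMMENTS, n_comments):
            requests.post(
                "https://news.example.com/api/comments",  # hypothetical endpoint
                json={"article_id": article_id, "user": next(accounts), "body": comment},
            )
            time.sleep(random.uniform(60, 600))  # space posts out so they look organic

The randomized delays are the important bit: a burst of comments posted seconds apart is what gives away a seeded thread.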
Price Club #1 on Morena Blvd., San Diego, asked employees, friends, and family to park cars in the lot. It seemed to work. Of course, the financials weren't reporting average daily cars in the lot either.
Until now, this was something the start-up community (mostly) praised.
I guess every big-boy internet entity is losing a layer of shine in 2023.
This was well known.
Dating apps do the same. It's kinda easy to spot a bot. I wonder what GPT-powered bots will do on such apps.
Well, dating sites and other sites that needed to bootstrap to critical mass used to do it all the time. YouTube filled their site with copyrighted clips.
With AI, it would be far easier to bootstrap a plausible-looking site full of “active users”, and unscrupulous startups might do that.
That's why we built this: a service for any community to encourage discussion on ghost-town topics while clearly disclosing its bots. It's at least the most ethical approach we could come up with.
In the broader debate about generative AI, this seems to be one of the least harmful (it has responsible disclosure) and most helpful (https://xkcd.com/810/) approaches:
https://news.ycombinator.com/item?id=35779455
PS: Feel free to use it if you own a Discourse forum; we'd love to hear feedback.
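If you'd rather roll your own, here's a minimal sketch of the disclosed-bot idea against Discourse's standard REST API. The `Api-Key`/`Api-Username` headers and the `/latest.json` and `/posts.json` endpoints are real Discourse API features; the forum URL, bot account, selection heuristic, and nudge text are placeholders, not our actual implementation:

    # Minimal sketch: a clearly disclosed bot that nudges ghost-town topics
    # on a Discourse forum, using the standard Discourse REST API.
    import requests

    FORUM_URL = "https://forum.example.com"
    HEADERS = {
        "Api-Key": "YOUR_API_KEY",          # admin-generated API key
        "Api-Username": "discussion-bot",   # a dedicated, clearly named bot account
    }

    def find_ghost_town_topics(max_posts=1):
        """Return recent topics that never got a single reply."""
        resp = requests.get(f"{FORUM_URL}/latest.json", headers=HEADERS)
        resp.raise_for_status()
        topics = resp.json()["topic_list"]["topics"]
        return [t for t in topics if t["posts_count"] <= max_posts]

    def nudge(topic):
        """Post a reply that is explicitly disclosed as bot-generated."""
        body = (
            "*I'm a bot, posting to get the conversation going.* "
            "What do the humans here think about this?"
        )
        resp = requests.post(
            f"{FORUM_URL}/posts.json",
            headers=HEADERS,
            json={"topic_id": topic["id"], "raw": body},
        )
        resp.raise_for_status()

    for topic in find_ghost_town_topics():
        nudge(topic)

The key design choice is the dedicated, obviously named bot account plus the in-post disclosure, so nobody mistakes the nudges for organic activity.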
And all these years later, the site is overrun with botters and fake profiles.
Fake it till you make it!
Many small startups generate fake traffic, either for testing or for marketing purposes (e.g. inflating their numbers).
I would be surprised to find any successful company that had no shenanigans in their origin story.
My history includes a startup that shipped empty boxes to meet its numbers, one that scraped thousands of emails from more popular websites to pass off as its own traffic, and even one that forged stock certificates to secure funding. (The FBI ended that one.)