They don't even have to think of a way around any of it. They just need to search online for someone who has already found a way around it, or prompt an AI for a solution. And even if someone is so dull that they don't think of that, the workaround will probably reach them anyway as the info spreads by word of mouth.
They'll definitely be required to either add the attribution or stop using the code.
There can be punitive fines for copyright violation, more so if the copyright is registered. I think there's some leeway there for the court.
There may also be damages. In the case of, for instance, illegal distribution of a Disney movie, Disney may be entitled to the amount of sales it supposedly lost.
It makes me think that open-source projects should routinely offer their product for sale, without the attribution requirements. Then, if another company violates their license, they have a tangible dollar figure they can point to and say exactly how much revenue was stolen.
But there's no method or structure in place to pay a website a fraction of a cent. Ads are the only way we've found that actually implements a form of microtransactions... paying a tenth of a penny for a sliver of attention.
I don't want to defend ads, but whatever replaces them is going to be very disruptive. Maybe better, but very different.
In 2023 I did a deep dive into the crypto community with two main questions:
- do these people understand the principles of making good products?
- is anyone clearly working towards a microtransaction system that could replace advertising and subscription models?
After attending two conferences, having hundreds of conversations, and spending hours researching, my conclusion to both questions was no. The community felt more like an ouroboros. It was disappointing.
I don't want to pay NYT a subscription fee, I want to pay them some fraction of a cent per paragraph of article that I load in. Same goes for seconds of video on YouTube, etc.
Apparently I'm alone in this vision, or at least very rare...
I have also done similar research, because I wanted to build something to handle microtransactions on a personal website, designed so that it could scale to everyone if it were widely adopted.
I looked at cryptocurrency because it seems like the obvious naive solution. It doesn't work: the cost of the transaction itself far outweighs the value of the transaction when dealing with fractions of a cent. You'd want an entire network to be updating ledgers with ~millions of records per ~$1000 moved, and the fundamental tech of crypto leans towards slower, higher-value transactions rather than high-volume, small ones. Lots of effort has gone into bringing some coins down from "high value, low volume" to everyday consumer usage rates and values, but a transaction history at the scale of every ad impression for every person is a tough ask and would perpetually be fighting an uphill battle against energy costs.
Ultimately, the conclusion I came to is that the service would need to be centralized, and likely treated as cash by not keeping track of history. Centralized company creates "web credits", user spends $5 for 10,000 credits, these credits are consumed when they visit websites. Websites collect a few credits from each user, and cash out with the centralized company. The issue is that since it would cost more to track and store all the transactions than the value of the transactions themselves, you have to fully trust the company to properly manage the balances.
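The flow described above can be sketched in a few lines. This is a minimal illustration, not a real service: the `CreditService` class, the exchange rate, and all the names are hypothetical.

```python
# Minimal sketch of the centralized "web credits" model described above.
# CreditService and its numbers are hypothetical illustrations, not a real API.

class CreditService:
    CREDITS_PER_DOLLAR = 2000  # e.g. $5 buys 10,000 credits

    def __init__(self):
        # Only current balances are kept -- no per-transaction history,
        # which is the "treat it like cash" trade-off described above.
        self.balances = {}  # account id -> credit balance

    def buy_credits(self, user, dollars):
        # User pays real money in; the service credits their balance.
        self.balances[user] = self.balances.get(user, 0) + dollars * self.CREDITS_PER_DOLLAR

    def visit(self, user, site, price):
        # A page view moves a few credits from the visitor to the website.
        if self.balances.get(user, 0) < price:
            raise ValueError("insufficient credits")
        self.balances[user] -= price
        self.balances[site] = self.balances.get(site, 0) + price

    def cash_out(self, site):
        # The website converts its accumulated credits back to dollars.
        credits = self.balances.pop(site, 0)
        return credits / self.CREDITS_PER_DOLLAR
```

Note that the trust problem is visible right in the sketch: because there's no history, the service's `balances` dict is the only record that your money exists.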
I started building it and since I would be handling, exchanging, and storing real currency - it seemed subject to a lot of regulations. It is like a combination bank and casino.
I've thought about finishing the project with disclaimers that buying credits legally owes the user nothing and collecting credits legally owes the websites nothing, operating on a trust system. But any smart person would see the potential for a rug pull in that, and I figured there would not be much interest.
The alternative route, adhering to all the banking regulations to get the insurance needed to make the commitments to users and websites that would guarantee exchange between credits and dollars, seemed like too much for one person to take on as a free side project.
It would need to be mostly centralized, but keeping track of history would not be hard.
A typical credit is getting paid in, transacted once, and cashed out. And a transaction with a user ID, destination ID, and timestamp only needs 16 bytes to store. So if you want to track every hundredth of a penny individually, then processing a million dollars generates 0.16 terabytes of data. You want to keep that around for five years? Okay, that's around $100 in cost. If you're taking a 1% fee then the storage cost is 1% of your fee.
If your credits are worth 1/20th of a penny, and you store history for 18 months, then that drops the amount of data 17x.
(And any criticisms of these numbers based on database overhead get countered by the fact that you would not store a 10 credit transaction as 10 separate database entries.)
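For what it's worth, the arithmetic above checks out. A quick back-of-the-envelope in Python, using the same assumptions (16-byte records, a hundredth of a penny per transaction, ~$100 for five years of storage):

```python
# Back-of-the-envelope check of the storage figures above.
BYTES_PER_TX = 16
TX_PER_DOLLAR = 10_000           # one transaction = a hundredth of a penny

# $1M processed -> 10 billion transactions -> 0.16 TB
n_tx = 1_000_000 * TX_PER_DOLLAR
data_tb = n_tx * BYTES_PER_TX / 1e12
print(data_tb)                   # 0.16

# ~$100 of storage against a 1% fee on $1M ($10,000): storage is ~1% of the fee.
fee = 0.01 * 1_000_000
print(100 / fee)                 # 0.01

# 1/20th-of-a-penny credits (5x larger) kept 18 months instead of 60:
reduction = 5 * (60 / 18)
print(reduction)                 # ~16.7, roughly the "17x" above
```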
Fair enough on tracking history in the centralized model. I had suspicions there would be hidden costs that might make it too expensive. I don't think the data storage would be as much of a problem as the cost of writing it to storage.
I wasn't fully envisioning credits only being transacted once before cash-out, either. I was thinking more along the lines of being able to create something that goes viral: a lot of people use it, you rack up a bunch of credits, and then you can sit on those credits and spend them as you use the internet without ever having to connect to a bank yourself. People who contribute more than they consume would rack up credits, which they could use to enrich their contributions, maybe pay for cloud services, etc.
The credits could form their own mini web economy if it got popular enough. As cool as this would all be if done honestly, I know that if I saw a company telling me to buy web credits to use anywhere on the internet, where websites get to decide how much to charge and charge it automatically when I visit, and where I may not be able to cash out or get my money back if the company goes out of business, then I likely wouldn't be buying those credits... so I don't know.
Even with user to user credits it would take a lot for the number of transactions to go above 2. That would mean more than half the money is going to viral payouts.
And was this assuming you'd only take a cut on the cash going in and out? Because even a 0.1% cut of the transactions would mean you have $1000 to handle the amount of data I described in the last comment.
>And was this assuming you'd only take a cut on the cash going in and out
I think the fee needs to be per transaction: maybe not cash-flowed per transaction, but accrued per transaction.
Say we both self-host a website for our favorite daily game, and I use yours about as much as you use mine. We would transfer roughly the same amount of credits back and forth to each other ad infinitum, but the credit service provider would be accumulating only expenses with each transaction.
Or say someone makes a lot of bot accounts to simulate user traffic, and sends each of them credits to spend visiting their own site. The host collects the credits from the bots and transfers them back to the bots to keep them running.
You are not alone; people seriously proposed one scheme after another in the early 2000s, roughly the same time frame as RSS. Somehow these proposals were undermined and slow-walked? Mergers and acquisitions in Silicon Valley were aligned with very different things.
>"Ads are the only way we've found that actually implements a form of microtransactions... paying a tenth of a penny for a sliver of attention."
Ads were the path of least resistance, and once entrenched, they effectively prevented any alternative from emerging. Now that we've seen how advertising scales, and how it's ruined our mediascape, we're finally looking at alternatives. Not dissimilar to how we reacted to pollution, once we saw it at scale.
Given that the go-to linear-algebra libraries for the past N decades (BLAS, Linpack, etc.) are Fortran, I'd suspect that neural-network people would be rather okay with it, esp. if it could be driven with a Python wrapper (which is how most people use BLAS and Linpack today).
BASIC is roughly to Fortran what Rust is to C++: its creators set out to design a "better Fortran", and realized that the limitations and complexities necessitated creating a whole new language.
> Along with the time sharing system came the new language which they decided to call BASIC. At first it was going to be a subset of Fortran but they decided that no subset of any existing language would be complete enough.
> Kemeny and Kurtz realized that if they wanted to reach everyone on campus with their time-sharing vision, they needed to simplify the user interface. The popular programming languages at the time, FORTRAN and ALGOL, were "just too complicated," Kurtz recalled. "They were full of punctuation rules, the need for which was not completely obvious and therefore people werenʼt going to remember."
The truth is not as strong as I had claimed. BASIC's expressions kinda resemble Fortran's, probably because that was what was lying around. It seems that an easier version of an existing language is what Kurtz wanted, but Kemeny was more interested in starting from scratch, which view Kurtz came around to. From Wikipedia (https://en.wikipedia.org/wiki/Dartmouth_BASIC):
When the topic of a simple language began to be considered seriously, Kemeny immediately suggested writing a new one. Kurtz was more interested in a cut-down version of FORTRAN or ALGOL.[14] But these languages had so many idiosyncrasies that Kurtz came to agree with Kemeny:
If we had corrected FORTRAN's ugly features, we would not have FORTRAN anymore. I reluctantly had to agree with John that, yes, a new language was needed.[15]
I voted you up because you're correct, in that the only solution is construction and there are people that are doing everything in their power to avoid that truism.
But I don't think it is a left/right issue. In certain regions it may be the left, in others the right, but generally it is subset of both that have investment in artificial scarcity. It's just the justifications that change depending on ideology.
I'm very progressive in some ways but I do think progressives make this particular problem worse, often with good intentions. Both sides are equally NIMBY but liberals also have:
- More environmental regulations that can be weaponized by NIMBYs.
- Attempts to solve the problem using various forms of rent control, which make it worse.
- Related: conservatives favor less regulation, which leads to more construction.
- Liberals' hatred of (or at least distrust of) landlords. Some of this is well-deserved, but I've seen liberals oppose good policies because they would "help landlords".
Lastly, home owners--including liberals--like to see the value of their property go up and tend to favor policies that make it so. It would be nice if we could get people to stop looking at their homes as financial investments.
And despite saying that they're the party of YIMBY, in practice we can clearly see that Democrats simply aren't. They'll say that they allowed ADUs, but then Dallas will come along and build 10,000 homes in the time it took Seattle to simply debate ADUs.
At some point you have to look at the actual results of policy.
See NIMBYs all down the west coast. I bet 90% of city dwelling homeowners would identify as “democrat or further left”, but are very conservative with the character of their neighborhood.
In my experience the builders and tradesmen, who skew more right-wing, have more to gain from allowing more and faster construction and are more interested in removing laws and restrictions.
A lot of this comes from the attitude in the 60s and 70s, when the liberal strategy was to sue the government to stop it from destroying the environment. People from that era saw the smog and the flammable rivers and are generally against development, even though today's development processes are starkly different from back then.
Right, it's not an inherently left-vs-right thing, but today NIMBYism is largely a left-wing phenomenon, with really high-end housing developments that are politically untouchable by housing projects.
The answer is always the same, though: make it easy to build housing, and build more housing. Keep building until there's a glut of supply.
Could any of the IT professionals here think of a way around it? Then likely the kids could too.