The extra power usage is destroying our planet, all so someone can try to make a quick buck, and it's causing GPU shortages during a pandemic, when entertainment like gaming is valuable for mental health and keeps people from going outside and getting/spreading the virus. The incentives around crypto are all f'ed up right now. The ETH PoS switch can't come soon enough.
I think it's quite fair to say earning income is less important than allowing people to achieve happiness, and video games are a pretty well-known way to accomplish that.
I unequivocally disagree with this statement - "happiness" achieved by video games is superficial, without real meaning. They are thrilling endeavors and fill time effectively, but they are absolutely secondary to the real positive of actually deriving income. I can't believe anyone would seriously argue a distraction is worth more than actual income.
What does mining cryptocurrency have to do with earning income? The two are completely unrelated; in fact, mining is a great example of making money without doing any useful work.
>Air pollution accounts for 1 in 8 deaths worldwide - approximately 7 million deaths in 2012, according to new data from the World Health Organization (WHO). The findings, released in late March, doubled previous estimates from just a few years ago in 2008. WHO now characterizes air pollution as “the world’s largest single environmental health risk.”
Mining cryptocurrency isn't "earning" income. Doing work is earning income. Crypto miners are about as useful as landlords - they shouldn't exist and don't deserve any money.
And I'd actually like to see a study on the positive mental health benefit of gaming, because I'm skeptical.
Also, the idea that you need a primo GPU in order to game is a bit silly. And if you're the kind of person who absolutely needs the best graphics in order to enjoy gaming, please see my point no. 2 about mental health.
The issue isn't answered by "we're using renewable energy - see, this doesn't make things worse." It also doesn't make things better. What's more, it's increasing the consumption of energy, which is at the core of the problem.
It would have been even better to push that renewable energy out onto the grid (and also not increase the consumption of power for crypto mining).
Switching all new power consumption to renewables doesn't improve things, because the baseline of non-renewable generation is still there. We need to reduce existing consumption and switch existing consumption to renewables.
I suspect a bit of Parkinson's law is at play with energy. https://en.wikipedia.org/wiki/Parkinson%27s_law -- The key is to stop making it worse (by mining crypto and trying to justify it with "but it's from renewables").
I mine on renewables and with 3000-series GPUs. I didn't want to do any of this; I ended up here because my energy retailer decided to tax me for exporting too much energy to the grid. So short of earthing the electricity, my solution was to burn it up in mining and improve my solar ROI.
The 3000-series GPUs are the most efficient in hashes/watt. So by disabling this, you are actually driving people towards less efficient, worse-for-the-environment ASIC machines.
Yes, I put batteries in my home and use my solar on everything I can first... there is still excess, especially at peak.
Pretty sure this dismantled most of your argument.
You are the exception of all exceptions. What percentage of GPUs being purchased for crypto are bought for reasons like yours, vs. by already-wealthy conglomerates and mining groups to further enrich the new crypto multi-millionaire/billionaire class?
Aren't the wealthy just buying giant TH/s ASIC machines?
The GPUs are good exactly because they are efficient, but if you didn't care about power consumption, ASICs are more profitable (in terms of cost of setup vs. return from mining).
The primary reason I went with GPUs is that I can shut them off based on solar excess in 100 W increments. So if I'm 200 W in excess, turn 2 cards on; 300 W, 3 cards; etc.
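That on/off scheme is simple to sketch. Here's a minimal illustration, assuming each card draws roughly 100 W while mining and that you can poll the inverter for current export (the `cards_to_enable` helper and the 100 W figure are assumptions for the example, not the commenter's actual setup):

```python
# Hypothetical excess-solar mining control: run one ~100 W GPU
# per 100 W of measured export to the grid, capped at the rig size.

CARD_WATTS = 100   # assumed draw per card while mining
NUM_CARDS = 6      # assumed rig size

def cards_to_enable(excess_watts: int) -> int:
    """Number of cards to run given current export to the grid."""
    if excess_watts <= 0:
        return 0
    return min(excess_watts // CARD_WATTS, NUM_CARDS)

print(cards_to_enable(250))   # 2 cards at 200-299 W of excess
print(cards_to_enable(1000))  # capped at 6 cards
```

In practice you'd want hysteresis (a dead band around each threshold) so clouds passing over don't toggle cards on and off every few seconds.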
They're not. ASICs are more a function of the coin being mined. Namely, if ASICs are out and competitive, they generally replace GPUs altogether, as they're both more powerful and more efficient. See BTC, where GPU mining is completely worthless. ETH, on the other hand, has no notable ASIC out AFAIK and requires GPU mining. ETH miners, regardless of scale, thus require GPUs.
This is a bit of a "two wrongs don't make a right" situation: the fact that your energy retailer is taxing you doesn't (by itself) make it right to use that energy for mining.
Of course, I totally understand that you do what you do on an individual basis, but that just stresses the point that this can only be regulated at the group/population level.
Not really. I'm not sure I even believe the "grids weren't designed for feed-in" BS. Wires don't care where the electrons come from, and they have no problem with me drawing the same load I feed in... so something doesn't add up. But I'm no electrician... I just cry a little at the 7-10 kW excess that I either have to pay to feed in or earth.
I don’t see anything I’m doing as a wrong. Literally just wasted energy... why not convert it to $?
> In the old days, when power companies were the only generators in town, control devices like regulators, capacitors or relays were designed to assume that power flowed in only one direction. “If they saw power flowing in the other direction, they typically tripped or misbehaved, causing customer outages or power quality issues,” Kuloor says. This tendency has prompted many utilities to update such equipment with reverse-power-flow logic or, in the case of mechanical devices that have no software, replace devices altogether.
Not all local grids are prepared for that.
Additionally, there's the signaling of power supply that generators do between each other.
The different parts of the grid will use fluctuations in the frequency of the grid to indicate if they've got too much power, or can't supply enough.
This gets more complicated with individual houses putting power back in: they match the grid frequency but don't control it as such. If there is too much supply from single-house sources, it can cause the frequency of the grid to vary too much and potentially damage the power plant generators.
When this becomes exaggerated (all the houses start producing more power because the sun came out, or the opposite when it clouds over), it can increase the cost of power.
You've also got the Hawaii problem. The Hawaii grid doesn't have the luxury of shipping its power to the next state over.
There are times when rooftop solar in Hawaii has exceeded the island's total demand for power. This meant there was too much power in the wires and the generators needed to shut down (or run into problems with pollution and inefficient combustion).
> The grid can only accept as much power as the island is consuming. Juario must mix and match different sized generators to balance what solar rooftops are producing while ensuring that the generators have enough “spinning reserve”—room to throttle up and down to handle those grid surprises. The Maui Electric chief operator must also keep the generators running hot to prevent inefficient combustion from sending dirtier exhaust up the stacks and violating the air quality rules that protect residents’ health.
This all feels like a buffering problem. Can power companies not just buy big batteries? You wouldn't even need that many, just enough to signal the generators to do the right thing.
(Also, I have no idea about all the gadgets between the houses and the generators, but I'm assuming they can all handle load going in and out.)
And there are other approaches to power storage. Thermal storage is popular/useful with large solar installations. Hydro dams associated with a reservoir often have pumped storage, where they can pump water back into the reservoir and then use it when it's needed.
The signaling and frequency matching is still an issue.
I recall that a company I worked at in California had two sets of generators: a diesel emergency generator for the data center, and a set of natural gas generators to cut down on power costs (when the price went high). The issue with the natural gas ones was that they needed something else to provide the utility frequency; something about them not being stable/consistent on their own.
> An electrical power system containing a 10% contribution from PV stations would require a 2.5% increase in load frequency control (LFC) capacity over a conventional system, an issue which may be countered by using synchronverters in the DC/AC-circuit of the PV system. The break-even cost for PV power generation was in 1996 found to be relatively high for contribution levels of less than 10%. While higher proportions of PV power generation give lower break-even costs, economic and LFC considerations impose an upper limit of about 10% on PV contributions to the overall power systems.
The key point is that if solar power increases, the grid as a whole also needs to increase its LFC capacity to stay in control of the load frequency. That is likely what you're seeing. By itself, that storage isn't sufficient to answer the issue. Ideally, the solar systems would have their own local batteries and respond as part of a smart grid, contributing according to the load frequency. ... But that costs more money at installation, and consumers are hesitant to pay it.
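A rough worked example of the quoted figure, using invented baseline numbers (the 200 MW conventional LFC capacity here is purely illustrative; only the 10%/2.5% relationship comes from the quote):

```python
# Toy reading of the quoted study: a 10% PV share requires ~2.5%
# more load-frequency-control (LFC) capacity than a conventional grid.
# The baseline LFC figure below is invented for illustration.

baseline_lfc_mw = 200.0   # hypothetical conventional LFC capacity
pv_penalty = 0.025        # 2.5% increase at a 10% PV contribution

required_lfc_mw = baseline_lfc_mw * (1 + pv_penalty)
print(required_lfc_mw)    # ~205 MW of LFC capacity needed
```

The absolute numbers are small; the point of the study is the cost trend, which is why it lands on a ~10% ceiling for PV contribution.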
Very true. If you're pushing power out onto the grid, you're no longer a consumer but rather a producer and there are other issues that come into play.
Twenty, thirty years ago the amount of power from rooftop installations being pushed out onto the grid was minimal compared to the size of the grid and it wasn't an issue.
If you had an energy producer that was producing 10% of the power and making significant swings in its production without participating in the wholesale energy market (and without signaling those swings)... the regulators would have shut them down.
But when you've got 10,000 people each contributing 0.001% of the grid's power (all with the same swings), not participating in the wholesale market, and not signaling their swings, that can be disruptive (and damaging).
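The aggregate arithmetic is the whole point here: many tiny, correlated producers add up to exactly the single large producer that would get shut down. A trivial check:

```python
# 10,000 small producers, each supplying 0.001% of grid power,
# all swinging together with the weather.

producers = 10_000
share_each = 0.00001          # 0.001% of grid power, as a fraction

total_share = producers * share_each
print(total_share)            # ~0.1, i.e. the same 10% as one big plant
```

Same swing, same grid impact, but spread across participants that no regulator can individually police.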
Nowhere near. But if you factor in the tax I was paying (i.e., assuming I didn't earth it), it will reach cost-neutral in a year and return margin after that.
They are increasing this tax shortly as well... so it will only get better.
If one's motivation for using green energy is to make the environment not go to crap as fast, then using green energy to mine crypto doesn't help anything and the use of green energy is nothing but a fig leaf.
If one wants to mine crypto, then do it as cheaply as possible instead; it really doesn't matter what the source of energy is. The entire system of crypto mining is inherently about who can use energy the cheapest. What's more, if someone gets energy cheaper, it forces others to consume more power too.
If crypto is mined, it really doesn't matter what power it uses - mining crypto is still increased consumption of power. Someone else is going to have to use non-green energy to do whatever they're going to do.
Energy production is not zero-sum with some fixed amount of non-renewables.
If I install solar on my house, I'm not forcing anyone else to use non-renewables. If I then use that solar to mine crypto, I'm... still not forcing anyone to use more non-renewables.
Oversimplifications like this are a fantastic way to avoid the core issues. Harvesting said ball of energy isn't free. Solar tech and near-future power infrastructure are certainly not at the maturity level for this, especially when the consumer is a power and semiconductor black hole of demand.