REN: Reperceiving the Grid Part 1
2021.06.30 – Ian Page
The grid is normally described as a cloud that delivers electricity as needed to a wide array of sinks (applications, users) at an averaged price, while remaining in control of the system and buying services such as generation and storage from a variety of external suppliers.
It has the very difficult job of balancing variable supply against variable demand, managing congestion on lines, and keeping alternative lines available to handle outages.
Supply and demand are balanced by market mechanisms such as day-ahead bidding for 15-minute periods, with inertia and battery systems balancing the system within each 15-minute period.
It's worth comparing this with the internet.
The internet must also maintain an instantaneous balance between packets coming onto the system and going out; packets cannot pile up. Congestion appears as delays to streaming rather than as brownouts or overvoltage.
The internet also has sources that can deliver huge quantities of packets unpredictably and sinks that can demand large volumes of packets unpredictably.
Both systems (I’ll just refer to both as clouds) have long-distance, high-capacity lines with expensive interfacing devices, and shorter-distance distribution lines (lower-voltage lines for the grid; headend cable or fiber connections for the internet).
The internet doesn't seem to have the same problems as the electricity grid. Partly this is because customers tolerate hiccoughs and slow downloads, whereas brownouts and 20-second blackouts would not be acceptable. So there is a difference in quality, if not in overall structure.
The internet cloud has some approaches that might have equivalents in the electric cloud.
The first is buffering: most real-time volume delivery keeps a ten-second or so buffer to allow for variable delivery times. Could appliances have a similar capacitor-based buffer? Would it cost much in high volumes, and how much could it eventually save?
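As a rough back-of-envelope sketch of what such a buffer would have to hold (the 3 kW kettle and the ten-second window are illustrative assumptions, not a design):

```python
# Back-of-envelope sizing for a capacitor buffer that rides an appliance
# through a short gap in supply. All figures are illustrative assumptions.

def buffer_energy_wh(power_w: float, seconds: float) -> float:
    """Energy needed to supply `power_w` for `seconds`, in watt-hours."""
    return power_w * seconds / 3600.0

# A 3 kW kettle bridged for 10 seconds needs about 8.3 Wh (30 kJ):
energy = buffer_energy_wh(3000, 10)
print(f"{energy:.2f} Wh")
```

A few watt-hours is modest for a battery but substantial for capacitors, which is why the cost-at-volume question matters.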
The second is the Akamai system, which holds Netflix films in a cache at the headend of the local distribution network rather than serving every request for a film from the originating source across the long-distance backbone lines.
An analogy for this might be some degree of storage at the customer-facing node of the electricity distribution system, so that short-term demands such as boiling a kettle or running a microwave are averaged and supplied locally, and the demand on the backbone is smoothed.
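A toy simulation makes the smoothing effect concrete. The store size, average draw, and demand profile below are all made-up numbers for illustration:

```python
# Toy model: a small store at the local distribution node absorbs short
# demand spikes (kettles, microwaves), so the backbone supplies only a
# steady average. All numbers are illustrative assumptions.

def backbone_draw(demand_w, store_wh_capacity, avg_w, dt_s=1.0):
    """Peak backbone draw when a local store buffers demand around avg_w."""
    store_wh = store_wh_capacity / 2          # start half full
    peaks = []
    for d in demand_w:
        surplus_wh = (avg_w - d) * dt_s / 3600.0
        store_wh = min(max(store_wh + surplus_wh, 0.0), store_wh_capacity)
        # if the store is neither empty nor full, the backbone sees avg_w;
        # otherwise it must meet the raw demand
        peaks.append(avg_w if 0.0 < store_wh < store_wh_capacity else d)
    return max(peaks)

# 60 s of 500 W base load with a 10 s, 3 kW kettle spike in the middle:
demand = [500] * 25 + [3000] * 10 + [500] * 25
print(backbone_draw(demand, store_wh_capacity=20, avg_w=1000))
```

In this sketch a 20 Wh store cuts the backbone's peak from the kettle's 3 kW to a steady 1 kW.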
Third, much of the "internet" experience is actually handled on the end device. For example, writing emails is essentially offline. Increasingly, Netflix and Amazon allow films to be downloaded onto devices for later watching. Much of what happens on a web page is actually done locally.
Finally, the nearest equivalent to the electricity 15-minute bidding system is the web-page ad-bidding system. When you request a web page, a mess of data about your preferences and habits, as well as your most recent browsing history, is sent out to hundreds of ad brokers. They bid for the ad space on the page, and the winners send you their ads (which my ad blocker then deletes before I see them). This all happens in a few milliseconds. It makes bidding for 15-minute generation slots 24 hours in advance feel a bit old fashioned.
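The core of that auction is simple to sketch. The broker names and bids below are invented, and the second-price rule is one common real-time-bidding convention, not a claim about any specific exchange:

```python
# Minimal sketch of a real-time ad auction of the kind run in milliseconds
# when a page loads. Brokers and bids are made up for illustration.

def run_auction(bids):
    """Second-price sealed-bid auction: the highest bidder wins the ad
    slot but pays the second-highest bid."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

bids = {"broker_a": 0.42, "broker_b": 0.38, "broker_c": 0.51}
winner, price = run_auction(bids)
print(winner, price)   # broker_c wins and pays broker_a's 0.42
```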
The nearest equivalent to electricity is Zooming. Here there is minimal buffering (a few milliseconds) and any problems are immediately visible. All data goes over the backbone system, and it's probably the most expensive resource used on the internet per bit.
Notably, the internet doesn't have a single moment-by-moment management function.
The primary point is that there are different uses of the internet cloud with very different strategies for delivering acceptable performance, yet to a large extent the whole of the electric grid is treated as equivalent to the Zoom application, i.e. instantaneous delivery using the long-distance resource.
This suggests that there might be some mileage in trying to see the clouds as more similar than different. (The water cloud I've written about previously is another example and is closer to the internet than the grid.)
We might start by framing the electric grid more like the internet: not as something that is in charge, requesting services from others so that it can broker a universal quality and performance, but as a set of services offered to its peripheral sources and sinks. Here a source is a peripheral that mainly delivers electricity overall, and a sink is one that mainly consumes electricity overall. Stores are both sources and sinks at different times, but are sinks overall due to storage losses.
This allows different grades of delivery and different services to be distinguished.
These services would carry variable charges depending on the QoS required. For example, one periphery might require a guaranteed 1 MW, 24x7, which could be expensive. Another might want a guaranteed 1 kW at a fixed price, with an option for up to a maximum of 4 kW at a different price. Another might want 4 kWh over 24 hours but be very accommodating as to when and at what rate. These are not so silly if you look at mobile-phone charging schemes, which seem to be exploring exactly this space!
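The three hypothetical tiers can be sketched as simple tariff functions. Every rate below is an invented figure, there only to show the shape of the pricing:

```python
# Sketches of the three hypothetical service tiers described above.
# All rates are invented for illustration, not real tariffs.

def firm_tariff(kw: float, rate_per_kw_day: float, days: float) -> float:
    """Guaranteed capacity 24x7: pay for the reserved kW whether used or not."""
    return kw * rate_per_kw_day * days

def capped_tariff(kwh_base, kwh_peak, base_rate, peak_rate) -> float:
    """Guaranteed 1 kW at a fixed rate, with dearer energy up to the 4 kW cap."""
    return kwh_base * base_rate + kwh_peak * peak_rate

def flexible_tariff(kwh: float, spot_rate: float, discount: float) -> float:
    """4 kWh whenever the grid likes: spot price minus a flexibility discount."""
    return kwh * spot_rate * (1 - discount)

print(firm_tariff(1000, 0.50, 1))          # 1 MW firm for one day
print(capped_tariff(20, 5, 0.15, 0.40))    # 20 kWh at base rate, 5 kWh above it
print(flexible_tariff(4, 0.20, 0.5))       # 4 kWh, fully flexible
```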
Delivery services would have similar options: guaranteed acceptance of 10 GW 24x7 would carry one value, while various types of variable delivery would carry different prices, both fixed and variable, depending on the deal.
An aspect of the service delivery would be whether the service uses the backbone or various layers of the distribution system.
If a source and a sink have an agreed contract that passes only over the local distribution system, it should be cheaper than one that sends electricity over a thousand miles of backbone. This would encourage sources and sinks to engage in specific contracts rather than just assuming that the electricity comes from somewhere. We see some elements of this in the UK, where you can buy a "green" electricity supply, though the locality issue is not defined.
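A toy cost model shows how a distance-based backbone charge would separate the two contracts. The per-kWh-km rate is an invented number, purely illustrative:

```python
# Illustrative comparison of a local contract vs a long-haul one.
# The backbone charge per kWh-km is an invented figure.

def delivery_cost(kwh: float, energy_rate: float, km_on_backbone: float,
                  backbone_rate_per_kwh_km: float = 0.0001) -> float:
    """Energy cost plus a distance-based charge for backbone use."""
    return kwh * (energy_rate + km_on_backbone * backbone_rate_per_kwh_km)

local = delivery_cost(100, 0.10, km_on_backbone=0)
remote = delivery_cost(100, 0.10, km_on_backbone=1600)  # roughly a thousand miles
print(local, remote)
```

Under these made-up rates the local contract is well under half the price of the long-haul one, which is the incentive the paragraph describes.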
There will naturally be an argument that a centralized, aggregated system providing smoothed costs is cheaper or simpler. But given the number of competing models for how grids should absorb green electricity, each with its own solutions, trying to maintain the existing concept may be neither the cheapest nor a viable way forward.
The grid can also offer cloud storage services: if a source has too much electricity at some time, it can choose between investing in its own local storage or putting the surplus into the cloud for later retrieval. The grid, or even an external storage supplier, could then charge for the amount stored and the time stored, somewhat like a bank, with the usual banking options: storing specifically (like gold in a vault), storing on a current-account float basis, or not storing at all and relying on spot-market surplus to satisfy eventual retrieval.
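Bank-style billing for stored energy is easy to sketch. The daily rate and the round-trip efficiency below are assumptions for illustration; the efficiency term also shows why stores end up as net sinks:

```python
# Bank-style billing for grid storage: charge per kWh held per day, the way
# a warehouse charges for space. Rate and losses are assumed figures.

def storage_fee(kwh: float, days: float, rate_per_kwh_day: float) -> float:
    """Fee for holding `kwh` for `days` at a flat daily rate."""
    return kwh * days * rate_per_kwh_day

def retrievable(kwh: float, round_trip_efficiency: float) -> float:
    """Energy actually returned after round-trip losses."""
    return kwh * round_trip_efficiency

fee = storage_fee(100, 30, 0.01)    # 100 kWh held for a month
back = retrievable(100, 0.85)       # an assumed 85% round trip
print(fee, back)
```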
This appears at first to be quite a rethink of the grid. The result of unbundling its services is that some will become cheaper, some will become more expensive (since they are currently subsidized by the cheaper ones), some will be novel, and the overwhelming need to constantly balance the whole network will be reduced to balancing some services over some scales, but not others.
One additional feature is that the grid's distribution and transmission systems, and even the wiring in houses, were designed for much lower demands than the great electrification will bring. Suppose a UK house goes from an average of 1 kW to 4 kW, nationwide. That would involve rewiring much of the older housing stock, digging up roads, finding ground for much larger transformers on housing estates, and quadrupling the number of pylons, overhead wires, and underground cables across the country. This is all very expensive and takes a very long time. By focusing the grid on a more limited set of goals and services, it may be possible to avoid much of it.