What you're seeing, and what's vital to understand, is that a solar panel will supply a certain current at any voltage - up to a certain point. That current is directly affected by the illumination available (the different W/m^2 curves - that's illumination power per square meter of panel area). At a certain voltage, the current starts to drop off, and eventually you hit the open circuit voltage (Voc) - the voltage the panel produces when there's no current draw. The peak power (the maximum power point) comes slightly past the start of that current drop, and the available power falls rapidly as you go past that point into voltage collapse. At both the short circuit point (0V, plenty of amps) and the open circuit voltage (0 amps, plenty of volts), the panel is producing zero usable power.
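If you want to play with the shape of these curves yourself, here's a minimal sketch. The panel model (a simplified single-diode equation) and every constant in it are made up for illustration - loosely picked to resemble a series string with a 70V open circuit voltage - not any real panel's datasheet:

```python
import math

# Toy I-V curve: a simplified single-diode model. Every constant here is
# made up for illustration (loosely resembling a string with a 70V Voc),
# not taken from any real panel's datasheet.
I_SC = 9.0     # short-circuit current, A
V_OC = 70.0    # open-circuit voltage, V
N_VT = 3.0     # lumped diode factor, V (tuned only for curve shape)

I_0 = I_SC / (math.exp(V_OC / N_VT) - 1)   # so current reaches zero at Voc

def current(v):
    """Panel current at terminal voltage v, clamped at zero past Voc."""
    return max(0.0, I_SC - I_0 * (math.exp(v / N_VT) - 1))

# Sweep the voltage range, tracking power = V * I, to find the peak.
best_v, best_p = 0.0, 0.0
v = 0.0
while v <= V_OC:
    p = v * current(v)
    if p > best_p:
        best_v, best_p = v, p
    v += 0.1

print(f"Maximum power point: {best_v:.1f} V, {best_p:.0f} W")
print(f"Power at short circuit (0 V): {0.0 * current(0.0):.0f} W")
print(f"Power at open circuit (Voc):  {V_OC * current(V_OC):.0f} W")
```

Sweeping the full range like this is crude, but it makes the point: the power peaks just below the knee of the curve, and falls to zero at both ends.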
There's a standard current-voltage diagram that demonstrates how this works for a particular example panel. The red curve is the current, and the blue curve is the power. The dot represents the maximum power point on both curves. Notice that the power curve to the right of the maximum power point is quite steep - it's not a gentle dropoff.
The curves change absolute values somewhat with both illumination and temperature. A colder panel will produce a higher voltage, which a good MPPT controller can extract as extra watts in the winter (when you really want all the watts you can get). Plus, in certain conditions you can see illumination above the standard 1000W/m^2, which leads to an awful lot of extra power. When might you see that? A vertical panel, with snow on the ground, on a bright, sunny winter day. Also, "cloud edge" effects (the edge of certain cloud formations can focus more light on your chunk of ground than full sun). In those conditions, a panel will produce more than rated current and voltage, and you'd better have designed for that! I've seen north of 11A from 9A panels in the winter reflection condition.
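The cold-weather voltage boost is easy to estimate. This sketch assumes a typical crystalline-silicon temperature coefficient of about -0.3% of Voc per degree C - a representative number, not a spec for any particular panel, and 70V is an assumed string Voc:

```python
# Cold-weather voltage boost, roughly. The -0.30%/degC Voc temperature
# coefficient is a typical crystalline-silicon value, not a spec for any
# particular panel; the 70V string Voc is an assumption for illustration.
V_OC_STC = 70.0        # open-circuit voltage at 25 degC, V
TEMP_COEFF = -0.0030   # fractional change in Voc per degC

def voc_at(temp_c):
    return V_OC_STC * (1 + TEMP_COEFF * (temp_c - 25.0))

print(f"Voc at  25 degC: {voc_at(25):.1f} V")   # the rated figure
print(f"Voc at -10 degC: {voc_at(-10):.1f} V")  # cold panel: more volts
print(f"Voc at  60 degC: {voc_at(60):.1f} V")   # hot rooftop: fewer volts
```

A few degrees of cold buys you several extra volts per string - which is why string sizing has to account for record-low temperatures, not just averages.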
My morning panels, right now, are producing 1.8A at 58V. In these conditions (afternoon shade, but a partly cloudy day), they'd happily provide 1.8A at 12V, 1.8A at 24V, 1.8A at 40V, 1.8A at 60V... right up until I pass the knee in the curve. Open circuit voltage today is 70V (my little PWM controller can tell me this), so peak power is probably right about 57-60V. And, if I were to try to pull more than 1.8A from them, the voltage would collapse. That's just what they can do right now, aimed as they are.
Swing them around to face the sun, and they're operating at 7.4A at 58V. These panels are connected through a PWM controller (pulse width modulation - basically a switch that toggles quickly), so they always operate at whatever my battery bank voltage is. That means they're producing more watts when my bank is charging heavily (60V) than when it's empty (48V). But that's the way I hooked them up because it's cheap and plenty good enough for my needs. Since their peak power point comes fairly close to my battery bank voltage, the (small) gains of an MPPT controller don't justify the cost on this secondary array. But what's MPPT?
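To put rough numbers on why the gains are small for this array - the 7.4A/58V operating point comes from my panels above, but the maximum power point values are my assumptions for illustration:

```python
# PWM vs MPPT, back of the envelope. The 7.4 A / 58 V operating point
# comes from the text; the maximum power point values (60 V, 7.3 A) are
# assumptions for illustration, not measured figures.
panel_current = 7.4      # A sourced at battery voltage (PWM operation)
battery_voltage = 58.0   # V, current bank voltage
mpp_voltage = 60.0       # V, assumed maximum power point voltage
mpp_current = 7.3        # A, assumed current at the MPP

pwm_watts = panel_current * battery_voltage
mppt_watts = mpp_current * mpp_voltage

print(f"PWM:  {pwm_watts:.0f} W")
print(f"MPPT: {mppt_watts:.0f} W (before conversion losses)")
print(f"Gain: {100 * (mppt_watts / pwm_watts - 1):.1f}%")
```

A couple of percent, under these assumptions - which an MPPT controller's own conversion losses would eat into anyway. When the battery voltage sits far from the panel's MPP, the math swings hard the other way.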
Look back at the diagram. The top of the power curve is called the "maximum power point" - for what should be obvious reasons. That particular voltage/current point is the absolute maximum number of watts you can get out of the panel at this particular point in time. A more sophisticated charge controller can track this point by sweeping across the range of voltage/current values and finding the maximum power. My main array of 8 panels is hooked up to an MPPT charge controller (a Midnite Classic 200, which runs around $600). If I load things up enough to get them at the maximum power point, they're operating at about 116V/13.8A/1600W (two strings of 4 panels in series, instead of one string of 2 like my morning panels). It's a good solar day. The MPPT controller converts that power into what my battery bank (and the rest of my system) wants - about 27A at 59V. This is the insides of a Midnite Classic 200, and it's a fairly complicated bit of circuitry (this unit can handle up to 4500W of panel on a 72V battery bank). And this is only doing the maximum power point tracking and DC-DC conversion - it's not even outputting an AC waveform!
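One common way controllers track this point is "perturb and observe": nudge the operating voltage a little, and keep moving in whatever direction makes power go up. Here's a toy version against a made-up array model (constants loosely shaped to resemble the ~116V/1600W figures above - this is not the Classic 200's actual algorithm):

```python
import math

# "Perturb and observe" MPPT sketch: nudge the operating voltage and keep
# moving in whichever direction makes power go up. The array model is a
# made-up single-diode-style curve, loosely shaped to resemble the
# ~116 V / ~1600 W figures in the text - not any real controller's code.
I_SC, V_OC, N_VT = 14.5, 140.0, 9.0
I_0 = I_SC / (math.exp(V_OC / N_VT) - 1)   # so current hits zero at Voc

def panel_current(v):
    return max(0.0, I_SC - I_0 * (math.exp(v / N_VT) - 1))

def power(v):
    return v * panel_current(v)

v, step = 80.0, 1.0        # start well below the MPP; 1 V perturbation
last_p = power(v)
for _ in range(500):       # each iteration is one control tick
    v += step
    p = power(v)
    if p < last_p:         # power fell - we overshot, so reverse direction
        step = -step
    last_p = p

print(f"Settled near the MPP: {v:.0f} V, {power(v):.0f} W")
```

A real controller also re-sweeps periodically, since the maximum power point moves around with illumination and temperature - and the oscillation around the peak is why a fixed perturbation step trades tracking speed against steady-state ripple.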
What happens when I don't need all that power (assuming the batteries are full)? If my AC compressor is turning, I need about 1.6kW. Shut that down, and I'm pulling about 950W. Where does the excess power go? It simply doesn't get produced in the first place. A charge controller can restrict the power drawn by pulling less current than the maximum power point current, which lets the voltage float up towards the open circuit voltage (you could also draw more current, but that's a far less stable way to operate). When I'm pulling 950W, my main array is running at 133V/5.1A/678W, while my morning panels make up the rest (actually, they produce what they can, and the main array makes up the rest). The system only draws as much as is actively being used.
So, going back to the curve: If I try to draw more than the peak power of my panels (in current conditions), the voltage (and power) collapses. If I tried to pull 2A out of my morning panels when they were facing east and only able to source 1.3A, the voltage would collapse to 0V and the power would drop to zero. What if I try to pull 2A out of them when they're swung out and able to produce 7.4A? Well, I can pull 2A for as long as I want.
The key here is that you cannot pull more than the maximum power from a panel - even by a little bit - without suffering a massive voltage and power collapse. You can operate below the maximum power point easily enough, but it's hard to identify the maximum power point without sweeping through the whole range to find it.
Microinverters Versus Charge Controllers/Off Grid Inverters
A typical grid tied solar system is built with microinverters. These are a combination MPPT tracker and inverter for each solar panel, normally in the 280-320W range, though that's creeping up with time as panel output increases. The output from these synchronizes with the grid - typically 120VAC and 60Hz, in the US. However, they're very simple devices. They don't have onboard frequency generation - they can only work when given a voltage waveform to synchronize against. They also only work at maximum power point - that's their whole point, and when the grid is up, they're connected to what is, from the perspective of a microinverter, an infinite sink. So they sit there, finding the maximum power point, and hammering amps out onto whatever waveform the grid is feeding them.
They also, because they're feeding the grid, have zero surge capability. A 320W microinverter can never source more than 320W, which is fine, because the panel will generally not produce more than 320W. There are conditions where it can, but they're unlikely for roof mounted panels (a very cold, very clear winter day would seem like a case, but the panels aren't typically aligned to take advantage of low winter sun). When the inverter can't process everything the panel could produce, it's called "clipping," and it's really not that big a problem as long as it's not many hours a year.
But, because of these requirements for the operating environment, microinverters are significantly cheaper to build. They just need to be able to find max power point and shove that power onto an existing waveform.
An off grid system typically has two different devices - a charge controller (the Midnite Classic shown above) and an inverter (sometimes more than one of each in parallel). These are separate devices, and cost a good bit more than a microinverter of comparable power. But, they also work with the battery bank, and have to deal with more amps. A 320W microinverter will typically draw around 10A on the DC side and output about 2.5A on the AC side. My charge controller tops out around 75A on the battery side, and my inverter can pull 125A from the battery bank (peak current). I've got a massive low frequency inverter that weighs about 40lb (for stationary use, I consider power density in inverters an anti-feature - I'd rather have a massive inverter than a tiny one, because they tend to last a lot longer). My inverter is rated at 2kW, but can source up to 6kW briefly if needed.
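The current arithmetic is worth seeing side by side. The 33V panel-side operating voltage is an assumed typical value, and conversion losses are ignored to keep the comparison simple:

```python
# Current arithmetic from the text, side by side. The 33 V panel-side
# operating voltage is an assumed typical value, and conversion losses
# are ignored to keep the comparison simple.
micro_watts = 320.0
panel_v, grid_v = 33.0, 120.0
print(f"Microinverter DC side: {micro_watts / panel_v:.1f} A at {panel_v:.0f} V")
print(f"Microinverter AC side: {micro_watts / grid_v:.1f} A at {grid_v:.0f} V")

# The same sort of power at a 48 V nominal battery bank means far more
# current - and far more copper - than grid-voltage AC:
inverter_watts = 2000.0
bank_v = 48.0
print(f"2 kW inverter, battery side: {inverter_watts / bank_v:.1f} A at {bank_v:.0f} V")
```

Low battery voltages and high currents are a big part of why off grid hardware is heavier and more expensive per watt.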
Some of the newer systems use a high voltage DC coupled setup - this is how the DC Powerwalls worked (the Powerwall 1 worked this way, and it was advertised for the Powerwall 2, but then cancelled). For this, you have a very high voltage string of panels (typically 400VDC, either from panels in series or from power optimizers, which are basically a microinverter that outputs high voltage DC), the battery bank hangs on that bus, and the inverter swallows 400VDC and puts out AC. This works better for higher power systems, but it's not a very common off grid layout.
You need batteries in an off grid system for two reasons: Energy storage is the obvious one, but they also cover peak power demands. Lots and lots of things in a typical home draw far, far more startup power than running power. Anything with a motor is likely to do this, and compressors are particularly bad about it (fridges, freezers, air conditioners, etc). Pretty much any semi-inductive load is going to be a pain to start in terms of current requirements. Again, using data I have handy, my air conditioner pulls about 700W running, but it pulls somewhere around 2kW, very briefly, when starting. My system is designed for this sort of load (my inverter is a 2kW unit with 6kW of peak surge capability), but you have to be able to handle that, or the system won't work. Even purely resistive loads have a startup surge - a cold incandescent bulb draws far more current at turn-on, because filament resistance goes up with temperature (you can radically extend the life of incandescent bulbs by putting a negative temperature coefficient resistor in series with them - this was a popular trick with aircraft landing lights before LEDs got bright enough). This is another reason off grid inverters tend to be large and heavy - they have to be able to provide that peak power. Most off grid inverters have a peak power delivery of 2-3x their sustained power delivery, and mine is on the high end, peaking at 3x rated.
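The sizing check itself is trivial - both the running draw and the brief starting surge have to fit under the inverter's respective ratings. A sketch, using the 2kW/6kW inverter and the air conditioner figures above:

```python
# Surge sizing check: both the running draw and the (brief) starting
# surge must fit under the inverter's respective ratings. The 2 kW /
# 6 kW figures match the inverter described in the text; the loads are
# examples.
def can_start(run_w, start_w, rated_w=2000, surge_w=6000):
    """True if a load fits the inverter's sustained and surge ratings."""
    return run_w <= rated_w and start_w <= surge_w

print(can_start(700, 2000))   # the air conditioner from the text: fits
print(can_start(700, 7000))   # a harder-starting load would not
```

In practice you also have to check that the battery bank can actually deliver the surge current without sagging below the inverter's low-voltage cutoff.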
Worth noting on batteries: They suffer age related degradation as well as cycle based degradation. You cannot keep any battery alive forever, even if you don't use it. Lead acid chemistries (flooded, sealed, AGM, whatever) are rarely good past about 10 years, though if you were to keep them really cold you could probably manage it (some of the industrial cells are rated for 15 years, but they're quite a bit more expensive). Lithium... eh. It supposedly lasts longer, but I treat accelerated lifespan tests as a general guideline for comparing batteries instead of full truth. I make a lot of money on dead lithium, and there are a lot of ways to kill them. They also require heating in the winter, or you'll get lithium plating while charging (which is also a way to kill the capacity).
Let me offer a general guideline on batteries: Any time you put any sort of battery into a power system, the system will never "pay for itself." There may be specialty cases where this isn't true, but it's a solid first order approximation you should be aware of. Off grid power is insanely expensive.
Off Grid Without Batteries
Now, how does all of this relate to off grid use without batteries?
If you have a typical grid tied system (microinverters or normal string inverters, so easily 95+% of installed rooftop solar), the system is technically incapable of running off grid (without additional hardware). There's no waveform to sync with, and the inverters cannot produce their own waveform. Also, they cannot operate at a reduced power output (this is more a side effect of the firmware, but it's true of the vast majority of units out there). So they can't produce less power than the panels are creating at the moment, and they can't produce more. And they can't make a valid AC waveform out of it. You can see how this might be a problem.
If you want off grid capability from a microinverter system, you need what's called an "AC coupled system." This involves a battery bank (uh oh), and an inverter/charger that can suck power from the home's AC grid, as well as deliver it. You generally can't size this to use the whole roof, as a 10kW charger/inverter and a battery bank that can handle that sort of charge rate are really expensive. Basically, this system provides a waveform for the microinverters, sucks excess power, and eventually shuts the microinverters off (usually by pushing frequency out of spec for them). There theoretically exists a setup that can tell the microinverters to back off a bit, and with the newer UL specs, that should be easier with some of the improved ridethrough curves, but... it's complex, and nobody really does this. Generally, you only couple some of the solar panels to the AC coupled setup, because it makes a smaller charger/inverter possible. So you may AC couple 4kW of a 12kW system.
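The frequency-shift trick can be sketched as a simple "frequency-watt" curve: as the battery inverter raises the AC frequency, a compliant microinverter ramps its output down, and past a cutoff it drops offline entirely. The breakpoints below are illustrative assumptions, not any vendor's actual settings:

```python
# Hypothetical "frequency-watt" curtailment curve. The 60.2 Hz and
# 61.0 Hz breakpoints are illustrative assumptions, not any vendor's
# actual settings.
F_START, F_STOP = 60.2, 61.0   # Hz: begin derating / fully off

def curtailed_fraction(freq_hz):
    """Fraction of available power the microinverter will export."""
    if freq_hz <= F_START:
        return 1.0
    if freq_hz >= F_STOP:
        return 0.0
    return 1.0 - (freq_hz - F_START) / (F_STOP - F_START)

for f in (60.0, 60.4, 60.8, 61.2):
    print(f"{f:.1f} Hz -> {curtailed_fraction(f) * 100:.0f}% output")
```

Older microinverters just trip offline at the first out-of-spec frequency, which is the blunt "shut them all off" behavior described above; the linear ramp is what the newer ridethrough curves make possible.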
The only real way to get off grid power without batteries is to go with an inverter that has an emergency outlet. Some of the SMA inverters support this (they call it Secure Power Supply) - you feed the whole rooftop array into them, and they can, if the sun is shining, provide 1.5kW or so to a dedicated outlet. So, from an 8-10kW array, on a sunny day, you can get 1.5kW by operating well below the peak power point. If the array can't keep up with current demand (a cloud goes over), the outlet shuts down. It's better than nothing, but this is just about the only way you can get battery-free off grid power. To get any sort of stable battery-free power, you have to run the panels well, well below peak power (30-50% of peak is as high as you can really run), and even then, you have a horrifically unstable system. If the array power briefly drops below demand (perhaps an airplane has flown over), you shut down the entire output for a while. Hopefully your devices can handle intermittent power like this. If the array can source 1300W at the moment and a compressor tries to draw 1301W while starting, you collapse the array voltage and shut down the outlet. That's really hard on compressors (and everything else attached to the outlet).
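The cliff at the edge of that headroom is brutally simple - it's a hard threshold, not a graceful derate. A sketch with the figures from the text (a real collapse is electrically much messier than a boolean, but the threshold is real):

```python
# The battery-free cliff: the outlet stays up only while demand is at or
# below what the array can source this instant. Figures from the text;
# a real voltage collapse is messier than a boolean check.
def outlet_ok(available_w, demand_w):
    return demand_w <= available_w

print(outlet_ok(1300, 950))    # normal load under a 1300 W array: fine
print(outlet_ok(1300, 1301))   # compressor start: collapse, outlet drops
print(outlet_ok(400, 950))     # cloud (or airplane) shadow: collapse again
```

A battery bank turns that hard threshold into a buffer, which is the whole point of having one.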
If, as some nutjobs prefer, you want sustained off grid running for most of the house, you can design a system with batteries that's intended for this sort of use. I plan to build this, eventually. I'll have 8-12kW of panels on the roof, feeding into a few charge controllers. These will feed into a moderately sized battery bank under my house, and will be coupled to a large inverter that supports grid tied production as well as standalone use (probably an Outback Radian 8kW unit). I'll have most of the house downstream of the inverter, so I can run everything I care about off the inverter - I'll lose some loads like the heat pump backup coils, possibly the stove, but the rest of the house will work, and I'll have enough surge capacity to do things like run the well pump and the air conditioner. I don't expect this system to ever "pay off" in financial terms, but I value stable, reliable power, and a test lab for this sort of operation.
Or you can separate your backup power from your solar, which I'll talk about a bit later.
So... hopefully that's a bit of a technical overview of how things work. I assure you, most of the furor over this is related to how systems are installed, not "Meanie Power Company Being Evilly Evil."
One term one will hear tossed about is the concept of "islanding." This refers to a chunk of the power grid (possibly a single house) that has power while the rest of the local grid is dead. It's common to hear "anti-islanding" blamed for why a home's solar can't produce power when the grid is dead. Lineworker safety is usually mentioned in the next sentence.
What this means, simply, is that a local generating system cannot (legally) feed into a dead section of power grid. For a home power system, this means that unless you have a specific mechanism for disconnecting the home from the power grid (typically called a "transfer switch"), you cannot power the local home circuits from solar or generator.
Now, that said, it's really less of an issue than it's made out to be. Backfeeding the power grid, according to some lineworkers I've talked to, is really not a big concern for two reasons. First, lineworkers assume lines are live until proven otherwise. And, second, no residential system is going to successfully backfeed a large dead section of grid. The grid without power looks an awful lot like a dead short, so the microinverters or string inverters or generator or whatever will instantly overload and shut down. It's in the regulations, but it's really not that big a concern from a technical/safety perspective.
But, if you haven't explicitly set your system up to support islanded operation with a transfer switch and battery, your solar won't power your house with the grid down.
It's Not Power Companies Being Evilly Evil - It's Homeowners Being Cheap

Why have I written all this? To explain (hopefully) that the reason most solar power systems won't work off grid has literally nothing to do with power companies being evil and demanding that you buy their power. It has everything to do with the system not being designed to run off grid. Why are they designed that way? Because it's cheaper. Period. A microinverter based system is substantially cheaper than anything with batteries (which will need regular replacement), and that's what people get installed when they want a reasonably priced bit of rooftop solar to save money on their power bill.
If you want to get a rooftop solar system that powers your home with the grid down, you can do it! The hardware is out there. But such a system will be significantly more expensive than a normal grid tied system, and it will likely never "pay off" in terms of money saved. That's all.
So stop blaming the power companies for homeowners buying a grid tied system (because it's cheap) and then complaining when it won't run off grid. That's like complaining that a Mazda 3 won't tow a 20k lb trailer.
The Cheap Path to Backup Power
Now, if you want emergency backup power, and your goal isn't to spend a comically large sum of money on a system like I'm designing (the ROI on my system design is "never" if you don't value sustained off grid power use), the proper solution is a generator. I highly, highly suggest a propane (or natural gas, if you have that) generator - it's so much easier to store propane than gasoline without it going bad. Ten year old propane is fine. Ten year old gasoline is a stinky, gummy varnish. Says the guy with an extended run tank for his gasoline generator.
A generator and transfer switch is the right option for almost everyone interested in running through a power outage. The solar feeds into the grid side of the transfer switch, the generator feeds into the house side. When the power goes out, flip the transfer switch, light the generator, and go. Or, if you get really fancy, you can get an automatic transfer switch that will even start the (expensive) generator for you!
This doesn't give you uninterrupted power (there's still a blip when the power sources change), but it's far, far cheaper than putting your house on a giant inverter and adding batteries.
Can't Microinverters Sync to a Generator?
If the microinverters need a waveform to sync with, couldn't you create that waveform with a generator or a tiny little inverter and have the rooftop units provide the rest of the power?
Unfortunately, no. A microinverter generally won't sync to a generator - and if it could, it wouldn't work anyway. Most fixed RPM generators (typically the cheaper open frame generators, running at 1800 or 3600 RPM) put out such amazingly terrible power quality that a microinverter will refuse to sync against them. Put one on a scope if you have one. They're bad. It's very nearly electronics abuse to run anything more complicated than a circular saw from them.
You can perhaps get the microinverters to sync against an inverter generator, but then where will the power go? Let's say you've got a 3kW Honda and an 8kW array - not an uncommon setup. You start the generator, the freezers and such start up, pull 1500W. The microinverters sync, and with the sun, start trying to dump 4-5kW onto the house power lines. The Honda will back off, since it looks like a load reduction, but you've got 4000W of microinverter output trying to feed 1500W of load, and the microinverters won't back off. What they'll do is drive voltage or frequency high, shut down, and then the Honda has to pick up the 1500W load instantly. And you'll do this over and over. If you don't destroy the generator, you'll probably destroy the loads. It just doesn't work.
The AC coupled systems solve this by saying, "Well, I'll just pull power into the battery bank until it's full." So they'll let 1500W of that output from the rooftop units feed the freezers, and pull the other 2500W into the battery bank, so things are stable. And then shut down the microinverters by pushing frequency out of spec when the batteries are full.
UL 1741 SA compliant inverters might be able to be tricked into partial output, but I don't think that's likely to be very stable for long.
Small Scale Battery Backup
Another valid option for backup power (as pictured in the title picture) is some sort of small battery box (with or without solar). I built a 1kWh power toolbox (with solar capability) last summer, and that can run at least some useful loads if we lose power. Goal Zero makes some nice equipment in this realm, if somewhat pricey. Though, really, my "power is out for a long while" plan involves my generator and some extension cords, for now.
Power Grid Stability and Rooftop Solar