Somehow it seems strange that Japan, the motherland of robots, fails to build a robot that can do the job.
Money can't be the reason. There has been a surge of investment into radioactive cleanup technology since the reactors were damaged six years ago.
Six years and billions of dollars later, we get a robot that gets stuck and fails after just two hours. From what I've gathered, they didn't even include a proper radiation sensor but used the camera noise to "guesstimate" the dosage after the fact. Did they not expect high radiation in that location?
Radiation-hardening electronics is not simple, and this robot was rated for 1000 Sv cumulative, which is something of an accomplishment even if it wasn't enough for the environment here.
The article also absurdly understates what an activity level like 650 Sv/h actually means. It "could kill a person quickly", the article says, as though the question were open - the truth of the matter is that such heavy irradiation will kill an unprotected human in a matter of seconds, a few minutes at most. As for protected humans, good luck wearing armor thick and dense enough to make a meaningful difference under that kind of bombardment - and if you do manage to find it and wear it, good luck not suffocating or being crushed outright under its weight.
This should, I hope, put in some more accurate perspective the fact that the robot lasted a couple of hours. It doesn't sound like much, and isn't - but it's vastly better than anything else we've got.
It should also, I hope, put paid to a sibling commenter's rather hideous suggestion of using Chernobyl-style "bio-robot" "liquidators" to deal with the aftermath. The only way that might possibly work would be by insulating the active slag under a thick layer of quite dead and half-cooked human bodies - a very Soviet suggestion, perhaps, but nothing to be seriously suggested by anyone who values even the pretense of civilized humanity.
Anybody who isn't familiar with the "liquidators" should watch Chernobyl 3828, a Ukrainian documentary about the cleanup of the worst part of the roof, with commentary from the people who worked their two-minute shifts as "bio-robots".
Off-the-shelf electronics get very unreliable in high-dose environments.
The radiation rewrites your loaded programs, flipping bits and causing rising edges where there shouldn't be any.
So the program running on your camera will dissolve above some radiation level, locking it into a watchdog-detect-and-reboot cycle. There are approaches to circumvent that in low- to medium-radiation environments: one is to have redundant systems and select the "majority" vote on computation results.
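The majority-vote idea can be sketched in a few lines. This is only an illustration (`majority_vote` is a hypothetical name); real triple-modular-redundancy systems typically vote in hardware, often on every clock cycle, rather than on final results in software:

```python
from collections import Counter

def majority_vote(results):
    """Return the value computed by the majority of redundant units.

    With three redundant units, a single corrupted result
    (e.g. from a radiation-induced bit flip) is outvoted.
    """
    value, count = Counter(results).most_common(1)[0]
    if count <= len(results) // 2:
        raise RuntimeError("no majority - too many units disagree")
    return value

# Unit B suffered a bit flip (0x40 -> 0x42); the vote masks it.
print(majority_vote([0x40, 0x42, 0x40]))  # -> 64
```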
But at this level, the only solution is to not have computation near the source and put up with long cable/fibre delays - which is tough with complex sensors (i.e., computers themselves) like cameras.
It is probably some snail-eye setup - the camera as a remote mini-drone on a cable, retracted on blackout.
In my day job, I (help) design electronics that are subjected to (much less) radiation, and we also irradiate our devices to check for long-term effects (this is done at a comparable dose rate in a facility built for the purpose).
Damage to your electronics in practice is almost exclusively caused by Photons (convention is to call this γ-Radiation when caused by radioactive decay in the nucleus, X-Rays when created in an accelerator). β-radiation (fast electrons) is easily shielded by thin layers of metal, and α (He nuclei) can't penetrate sheets of paper.
Individual photons can't really deposit much energy at a single location in your semiconductor, so they aren't able to generate enough charge to "flip bits" instantly. Flash still uses a lot of charge/bit, so it's relatively stable, DRAM is constantly refreshed and SRAM would need a jolt of high current to flip, so that does not really happen, either.
What radiation does, however, is to slowly damage the Silicon and change its crystal structure (introducing defects) which increases leakage and moves the analog threshold voltages of circuits around in a funny way. So, what we often see is flash becoming un-programmable on a more global scale (rather than individual stuck bits), and most importantly the analog aspects (voltage references, brownout-protection circuits, voltage regulators) cease to function.
This is all very variable, but we normally observe effects starting at "a few 100 Gy" when testing more complicated modules. We don't research individual components, though.
One especially nasty type of radiation is neutrons, though. When "moderated" (slowly decelerated by successive interactions with material), they tend to have a high likelihood of merging with some other nucleus (being captured), and the energy released by this capture can be huge and concentrated on a single spot. This can be enough to flip bits, and it may indeed be an issue near the funny isotope mixture present at various places in Fukushima. Having a strong neutron emitter is very uncommon, though.
Unfortunately this damage is not specific to digital electronics, and optical components in particular (the parent refers to long fibre cables) tend to be rather sensitive due to their large structures, so "keeping the computers out" may not help much overall.
I dimly remember reading about a chemical treatment of the material that would react with the ionized silicon until the reactant was depleted.
I do not remember the substance - and it was ridiculously expensive (in addition to the ridiculous expense of custom-made chips).
So, simple question: why isn't everything electronic shielded inside big lead blocks?
I get why this isn't the prevailing approach in aerospace applications: weight. But for a ground based robot with an external power source, why not heavy shielding cubes with minimal connections to the necessary exposed bits?
And why not just load it down with 5x sacrificial COTS cameras, then expose them as needed when the previous one dies?
For typical Gamma radiation of about 1MeV photon energy or higher, you need about 1cm thickness of lead to reduce the radiation to about a third (by a factor of 1/ℯ). One order of magnitude of radiation hardness (reduce radiation to 1/10th) needs two centimeters of lead. That's getting heavy pretty fast.
Radiation intensity inside your box depends on the thickness t: I(t) = I₀·exp(-t·ρ·μ)
t: thickness, say 1 cm
ρ: density of the material, for lead 11.34 g/cm³
μ: mass absorption coefficient, 0.1 cm²/g [see ref 1]
I₀: intensity outside the box
In [4]: import numpy as np
In [5]: np.exp(-0.1 * 11.34)   # -μ·ρ·t with t = 1 cm
Out[5]: 0.32174370422037013
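Inverting that formula gives the thickness needed for a desired attenuation factor. A small sketch using the same numbers (μ ≈ 0.1 cm²/g, ρ = 11.34 g/cm³ for lead at ~1 MeV); `thickness_for_attenuation` is just an illustrative helper name:

```python
import math

MU = 0.1     # cm^2/g, mass absorption coefficient for lead at ~1 MeV
RHO = 11.34  # g/cm^3, density of lead

def thickness_for_attenuation(factor):
    """Lead thickness (cm) needed to reduce intensity by `factor`.

    Solves factor = exp(t * rho * mu) for t.
    """
    return math.log(factor) / (MU * RHO)

print(thickness_for_attenuation(10))   # ≈ 2.03 cm per order of magnitude
print(thickness_for_attenuation(100))  # ≈ 4.06 cm - the weight adds up fast
```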
In this case there is zero need for electronics on the robot. You can have a purely mechanical system with a fiber-optic camera, like an endoscope. Transmit power as torque through a plumber's snake. It can then either carry a box with some sensor package, or take remote samples. For extreme radiation, you can put a film badge dosimeter in a water tub to work out the radiation levels indirectly.
It means that the robots have to be very "custom"-built ones, with the intelligent parts deposited outside the dangerous area, communicating with the muscle via e.g. fibre optics.
The robot ideally consists of some die-hard sensors positioned as far away as possible and as close as necessary, which can be retracted mechanically by elements left behind in case of software failure. Then there is the actual tooling, which is basically just a remote-controlled shell. A complicated setup, to say the least.
Imagine an anemone with some little remote camera fish on tentacles, controlling a robotic crab on the longest tentacle of them all.
Ideally you'd want it to be gamma-transparent as well; otherwise, those gamma photons which hit it are going to deposit their energy into it. This is how glass viewing windows in nuclear materials processing facilities, for example, gradually cloud and embrittle over time, and have to be replaced. I'm not sure how well a relatively thin optical fiber would stand up, especially under the kind of bombardment we're talking about here.
You can stop them by slowing them down. Usually you'd use elastic collisions with light atoms like hydrogen; water works, which is why reactors use it, and for shipping polyethylene is often used. Then some materials simply have a higher cross-section for absorbing the slow ones - I had to look this up: boron and cadmium.
(Married to person who used to ship Californium-252)
You can use humans to clean up the mess; their massively redundant computational model is much more robust to radiation damage. They will, however, suffer or die from the radiation exposure in the long term.
This is what was done at Chernobyl, in an admittedly much more dire situation. A number of Japanese remote-controlled robots were urgently imported back then as well, and failed in the same manner.
Not this mess, or even that one, really. The "bio-robots", so called by their masters, were mainly collecting debris from the reactor explosion for containment; they were not trying to deal with the core slag sunk into the basement of the building, which at that time was still thermally very hot as well as being ferociously active. Even to approach that would have been immediately lethal.
Given the cited dose rate, the same is true at Fukushima - an acute dose of 10 Sv is very likely lethal on its own, and at 650 Sv/h you get more than that every minute. Anyone you send into that is going to be unable to work within seconds, and dead inside a couple of minutes tops.
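A quick back-of-the-envelope check of those figures, taking the reported 650 Sv/h dose rate and the commonly cited ~10 Sv acute lethal dose:

```python
# Reported dose rate and a commonly cited acutely lethal dose.
dose_rate = 650.0          # Sv per hour
lethal_acute_dose = 10.0   # Sv

sv_per_minute = dose_rate / 60.0
minutes_to_lethal = lethal_acute_dose / sv_per_minute

print(sv_per_minute)       # ≈ 10.8 Sv accumulated per minute
print(minutes_to_lethal)   # ≈ 0.92 - a lethal dose in under a minute
```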
That's good to know, thanks. On the other hand, I find sources calling that dose rate lethal within minutes, and at least some of the Fukushima corium is roughly seven times as active. Even if you're willing to write off as many lives as it takes to reduce what is essentially solidified lava into a containable form by means of hand tools, I doubt it's likely to succeed, simply because it'll kill off your workers so fast that, after a little while, no further progress is possible because of all the corpses blocking the way.
Well, I mentioned it just as an example of humans being better at it than robots. I don't mean to imply that sending people to their deaths is a good idea.
We should also distinguish between site cleanup (which is certainly impossible) and secure containment (which is possible and was done before). The idea is not to chip away bits of highly radioactive substance, but find a safe way around it to isolate the world from further contamination. Apparently, not everyone is convinced Japan has a plan there.
Realize that the ostensible safety of nuclear energy is based on believing that black swan environmental and political events are impossible. Walk away from it and move heavily/quickly toward renewables.
There are 187000000000000 million gallons of water in the Pacific Ocean. The plant "leaks" (they're intentional) are not affecting Hawaii; they're not even affecting Japan's coast.
"In April 2014, studies confirmed the presence of radioactive tuna off the coasts of the pacific U.S.[228] Researchers carried out tests on 26 albacore tuna caught prior to the 2011 power plant disaster and those caught after. However, the amount of radioactivity is less than that found naturally in a single banana.[229][230]
As of June 2016, dispersed nuclear fallout and associated radiation contamination continue to pollute the environment. Tilman Ruff, a professor at the University of Melbourne, stated that every day 300 tons of contaminated water leak from the crippled nuclear plant.[231] The Reconstruction Agency states that 174,000 people have been unable to return to their homes. Ecological diversity has decreased and malformations have been found in trees, birds, and mammals.[231]"
300 tons is 80,000 gallons. The ocean is 187,000,000,000,000,000,000 gallons. And, it's not "less than a banana"; it's less than one twentieth of a banana.
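For what it's worth, the arithmetic behind the dilution claim, assuming 300 metric tons of water per day and the Pacific volume quoted above:

```python
# Sanity-check the dilution claim: daily leak volume vs. Pacific volume.
# 300 metric tons of water = 300,000 L; 1 US gallon ≈ 3.785 L.
leak_liters = 300 * 1000
leak_gallons = leak_liters / 3.785
pacific_gallons = 1.87e20   # figure quoted in the comment above

daily_fraction = leak_gallons / pacific_gallons
print(leak_gallons)    # ≈ 79,000 gallons per day
print(daily_fraction)  # ≈ 4e-16 of the ocean's volume per day
```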
We carpet bombed the oceans with nuclear weapons in the 1950s, which created tritium levels we can still track regionally today. Nobody thinks it's a great thing that Fukushima happened, but the water thing is probably not in the top 5 concerns.
It's still in danger mode, still leaking massive amounts of radiation into the Pacific Ocean. Based on what I've read about dying ocean life, it's probably one of the worst environmental disasters in history.
There are 187000000000000 million gallons of water in the Pacific Ocean, meaning that "radioactive waste" is 2.1 x 10^-10% of the body of water itself. The waste takes the form of HTO, tritiated water, which has extremely low bioavailability and has already been introduced in abundance to the ocean by other events. There are things to be alarmed about, but this isn't one of them.
Decommissioning a damaged reactor is insanely expensive and, in too many cases, not possible. Chernobyl is at best contained, not cleaned up. 30 years after the fire and meltdown, the giant curved roof built to cover it was moved into position last year. That will hold things for a few more decades. The next generation will have to work out the next steps.
The jammed AVR pebble bed reactor in Germany is contained, but there's no way to dismantle and dispose of it. Current plans are to wait 60 years and then try to figure out something.
Fukushima’s mess will also take decades. The site generated huge amounts of contaminated water, and there's now a processing plant to take most radioactive contaminants out of the water. The water still has tritium after processing. The half-life of tritium is 12 years, so that will cool off in a few decades. Meanwhile, huge farms of water tanks store the stuff. Lots of radioactive dirt has been dug up and buried deeper somewhere else. There's a refrigerated "ice wall" to try to stop leakage into the ocean. Not much has been done with the reactor vessels themselves. As today's story reports, they can't even get a robot close to the reactor vessel.
This makes one very discouraged about nuclear power.
Solar panels, folks. Solar panels. Wind. Batteries. We can do it, no problem. Electricity costs a little more, but your car ends up costing less. Less maintenance on fewer parts - just swap out the battery, which declines in cost each year.
Keep fossil fuels and wood for the things solar & wind will have trouble covering: airplanes, ocean shipping, winter heating.
You only need the first 15 minutes of this talk to see why current renewables won't scale. The world needs something like 15 terawatts. When this talk was recorded, solar electricity produced something like 0.001 TW. We're now closing in on 0.03 - and there are huge parts of the world that still haven't industrialized yet. We'd apparently need to be printing, distributing, and deploying solar cells the way we do newspapers to have any hope of using photovoltaics to offset our total energy demand.
Nuclear doesn't scale either, but the notion that we're all just holding back on renewables out of greed because all it takes is wind and solar... well, it's easily refuted.
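Rough arithmetic on that gap, taking the ~0.03 TW and ~15 TW figures above and some hypothetical sustained annual growth rates (`years_to_scale` is just an illustrative helper):

```python
import math

def years_to_scale(current_tw, target_tw, annual_growth):
    """Years of sustained exponential growth to go from current to target output."""
    return math.log(target_tw / current_tw) / math.log(1 + annual_growth)

# ~0.03 TW of solar today vs ~15 TW of world demand: a ~500x gap.
for growth in (0.10, 0.20, 0.30):
    print(f"{growth:.0%}/yr sustained -> {years_to_scale(0.03, 15, growth):.0f} years")
```

Even at an improbably sustained 30% per year, closing the gap takes over two decades; at 10%, most of a century.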
Fortunately we don't have to cover all the shortfall with just PV. Increasing energy efficiency and reducing consumption (including personal car ownership) in the first world can get us the rest of the way.
Our descendants have to do it. Coal, oil, and natural gas have maybe a century left. Fracking made more hydrocarbons accessible, but the supply is still quite finite.
We can't do the thing suggested upthread, eliminating dependence on all consumable energy sources by harnessing current wind and solar technology. Ultimately I agree that we have to find a way to do it. But there's a virulent belief that we can do it with technology that is being produced today, or might be produced in the near future if demand changed. No.
Tritium & tritiated water can be somewhat easily separated from water. Just freeze the mixture, and it forms layers. However, there probably is quite a bit of tritiated water.
>However, there probably is quite a bit of tritiated water.
Yes. Here's the tank farm.[1]
It could probably be let out into the ocean without much harm, if not done all at once. But there's opposition to that. Meanwhile, it decays with a half-life of 12 years, so eventually it will be harmless. Frustratingly, the concentration of tritium is too low for commercial extraction.
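The decay arithmetic, using the 12-year half-life mentioned above:

```python
# Fraction of tritium remaining after `years`, given its ~12-year half-life.
def tritium_remaining(years, half_life=12.0):
    return 0.5 ** (years / half_life)

print(tritium_remaining(12))   # 0.5 after one half-life
print(tritium_remaining(60))   # 0.03125 - ~3% left after five half-lives
```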
>This makes one very discouraged about nuclear power.
Yes, I used to think it could be made safe, but we are just hoping that we will be able to handle the waste later. And if there is a "100,000-year" accident every 10-15 years, it will be a rather depressing future...
I think that solar must be the way to do it. If we can't make do with solar, we won't be able to sustain the current population size.
It's always been a multi-decade cleanup project, but recently the estimates were pushed further into the future (~40 years). Also, the last article I read puts it at ~$180 billion. I can easily see that doubling before the end.
I'm wondering, can't they use the radiation as an alternative to light (photons), in that environment? I'm guessing the radiation would scatter in a way comparable to light, so you would be able to get a similar kind of "imaging". Not sure how one would implement a radiation detector in silicon, but at least you could shield it by some layer of metal to get out of the range of overload and into the range of sensitivity.
High-energy photons are quite different from visible light. They scatter instead of being reflected. They have a low cross-section, which means you can't make tiny pixels. No reflection or diffraction means no optics.
Are there any robotics startups building robots that can deal with hard radiation? It seems that they are trying to wash off some melted gunk that is blocking their way into the area beneath the vessel, where the core is suspected to be.
Will building a robot capable of completing such a task under such an environment attract any funding? Anywhere to apply?