Sunday, September 23, 2012

FUKUSHIMA EMISSIONS FAR GREATER THAN REPORTED


Ex-SKF Tells Us:
(Now They Tell Us) 1,590 Microsieverts/Hr in Futaba-machi, #Fukushima on March 12, 2011, Before Reactor 1 Explosion, and Vent, Not Explosion, May Have Caused High Radiation

Majia here: Much of the supposed data we have on Fukushima fallout comes from scientific studies that used computers to ESTIMATE fallout based on input data provided by Tepco and gleaned from the carefully censored SPEEDI data.

One has to wonder how many scientific studies out there with computer-generated estimates are flawed because the INPUTS were grossly UNDER-ESTIMATED, both deliberately and accidentally, by scientists working with Tepco and SPEEDI data.

The new data provided by Ex-SKF show that radiation levels reached 1,590 microsieverts (nearly 1.6 millisieverts) per hour in Futaba-machi.

Levels like that are deadly serious. 

It is sickening that computer-generated estimates of fallout will be used to deny that adverse health effects were caused by exposure.

The best data are from studies that actually MEASURED contamination.

Here is one example of an EMPIRICAL STUDY that MEASURED Radio-Xenon in the Pacific Northwest. 

Justin McIntyre, Steve Biegalski, Ted Bowyer, Matt Copper, Paul Eslinger, Jim Hayes, Derek Haas, Harry Miley, J.P. Rishel, Vincent Woods, “US Particulate and Xenon Measurements Made Following the Fukushima Reactor Accident.” Available at http://www.batan.go.id/inge2011/file/day1/1650_mcintyre.pdf. (These findings were also published in peer-reviewed scientific journals; see the bottom of this post for citations.)

THIS STUDY FOUND radioxenon levels in the Pacific Northwest at 450,000 times normal for over a month.
 
Radioactive gases are not good for your body. They emit beta particles and gamma rays when they decay.
 
How much radiation was really released? How much radiation is STILL BEING RELEASED with the ONGOING FISSION ACTIVITY AT THE PLANT?
 
Yes I know I'm shouting but I'm absolutely furious that the worst environmental disaster in human history is being HIDDEN FROM VIEW with both the knowing and also UNWITTING complicity of scientists.
 

Xenon Gas and Lethal Doses by Inhalation: 400.8 Million Lethal Doses


(this site is in French, so I used Google Translate)

The AIPR, or International Association for the Protection Against Ionizing Radiation, posted this on Nov 4 (please note it is translated from the French):

"The 'semi-officially acknowledged' release at Fukushima of 1.67 × 10^19 Bq of xenon-133 gas means the dispersal of 2.411 kg of the gas. In terms of internal irradiation of the lungs, this emission is equivalent to 400.8 million potentially lethal doses by inhalation. And yet, even though 2.4 kg of this material has the power to kill 400 million people merely in passing, xenon-133 is classified as very weakly radiotoxic. Isn't science beautiful!


"Xe-133 has a half-life of 5.244 days. It decays in beta mode with a decay constant of 1.52985 × 10^-6 s^-1 (ln(2)/T½, i.e. 0.693147 / (5.244 × 24 × 60 × 60) = 1.52985 × 10^-6). Its specific activity is 6.9271 × 10^15 Bq/g, or 1.872 × 10^5 Ci/g (6.0221415 × 10^23 / 133 × 1.5299 × 10^-6 = 6.9271 × 10^15 Bq/g). The inhalation dose coefficient is 1.20 × 10^-10 Sv/Bq.

"A lethal dose by inhalation, 5 sieverts, weighs 6.02 × 10^-6 g (6.02 micrograms), i.e. 4.17 × 10^10 Bq. One gram of Xe-133 thus provides 166,249 potentially lethal doses. Fukushima has released about 2,411 grams of this radioactive element, scattering into the environment, so far, the 'innocent' and 'very weakly radiotoxic' equivalent of 400.8 million lethal doses."
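Majia here: for readers who want to check the arithmetic, the figures in the AIPR quotation above can be reproduced in a few lines of Python. This is a sketch only; the 5-sievert lethal threshold and the 1.20 × 10^-10 Sv/Bq inhalation dose coefficient are taken from the quote itself, not independently verified.

```python
import math

# Inputs taken from the AIPR quote above
HALF_LIFE_DAYS = 5.244    # Xe-133 half-life
AVOGADRO = 6.0221415e23   # atoms per mole
MOLAR_MASS = 133.0        # g/mol for Xe-133
DOSE_COEFF = 1.20e-10     # Sv per Bq inhaled (quoted value)
LETHAL_DOSE_SV = 5.0      # assumed lethal dose, Sv (quoted value)
RELEASE_BQ = 1.67e19      # quoted Xe-133 release

# Decay constant: lambda = ln(2) / half-life, in seconds
decay_const = math.log(2) / (HALF_LIFE_DAYS * 24 * 3600)  # ~1.53e-6 per s

# Specific activity in Bq per gram: (atoms per gram) * lambda
specific_activity = AVOGADRO / MOLAR_MASS * decay_const   # ~6.93e15 Bq/g

# Activity and mass corresponding to one lethal inhalation dose
lethal_bq = LETHAL_DOSE_SV / DOSE_COEFF                   # ~4.17e10 Bq
lethal_grams = lethal_bq / specific_activity              # ~6.0e-6 g

# Total release expressed as mass and as potentially lethal doses
release_grams = RELEASE_BQ / specific_activity            # ~2411 g
lethal_doses = release_grams / lethal_grams               # ~4.0e8

print(f"decay constant    : {decay_const:.4e} /s")
print(f"specific activity : {specific_activity:.4e} Bq/g")
print(f"mass released     : {release_grams:.0f} g")
print(f"lethal doses      : {lethal_doses / 1e6:.1f} million")
```

Running this reproduces the quote's numbers: about 2,411 grams released and roughly 400.8 million potentially lethal inhalation doses.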
 
 

ADDITIONAL RESOURCES
S. Biegalski, et al., "US Particulate and Xenon Measurements Made Following the Fukushima Reactor Accident," accepted for publication in Journal of Environmental Radioactivity, 2011.
T. Bowyer, et al., "Elevated Radioxenon Detected Remotely Following the Fukushima Nuclear Accident," Journal of Environmental Radioactivity 102(7): 681-687. doi:10.1016/j.jenvrad.2011.04.009.
P. Eslinger, et al., "Source Term Estimation of Radioxenon Released from the Fukushima Daiichi Nuclear Reactors Using Measured Air Concentrations and Atmospheric Transport Modeling," to be submitted to Journal of Environmental Radioactivity, 2011.

 


10 comments:

  1. The SPEEDI data are only as good as the radiation source term that is plugged into them.

    I have seen a picture of steam and smoke coming out of stack #1 in the time between when the earthquake hit and when the tsunami arrived. It was probably spitting out radiation before the earth stopped trembling.

    ReplyDelete
  2. http://arxiv.org/pdf/1209.0648.pdf
    Since proportionality is obtained by emulating experiments of limited temperature ranges and modelling them according to known values, let us try the following model: rotate the y-axis by 180 degrees to have zero on top and 27 at the bottom. A hyperbola y = constant / x is what you will see from modelled figure 6. There are many constants found in models: (also try y = x / constant) 2nd order temperature dependency, const = 4.19 J / (cm x cm x cm x s x K x K), specific heat capacity at constant pressure determined by value cp = 6 cal/(mol x deg), heat capacity = 11.5 J/(K x mol)… which takes us to another simple hyperbola z = En / kT. Co-joining the exact determined modelled value to respective x and y axis at the part closest thereby where the ranges are terminated, leaves us with the following assumption: there is infinity above the graph and (1 / infinity) below the graph. With respect to each other: infinity minus (1 / infinity) = infinity. Values above the arranged collection of points are not affected below them, since they are not connected in any manner and cannot be connected, is what the assumption implies. Very few accurate experiments could be performed to determine constants at high temperature, due to limitations.
    The scattering of the i th fissile nuclide is i th catastrophic.

    ReplyDelete
  3. Changing the gradient of a straight line while plotting it results in a hyperbola, when the product of your values is a constant. In addition to this: the perfect mathematical model always assumes infinity; therefore truncating the graph at a low value immediately changes your value to a 1/infinity value. You might find yourself swapping the x or y value with a constant while using the constant for the x or y value
    as it changes out of range. In essence a hyperbola insists that a high value of x is a low value for y and vice versa, except in the low values for both. Rotating the axis might give you that effect, provided that you rotate it at the correct place and calibrate the current highest and lowest value. When both values increase simultaneously the hyperbola theory crashes, as much as the line theory crashes when the gradient changes, and there is the logarithmic theory etc. The problem is scale, and nuclear disaster is a scalar problem to take note of. Now try |ABSOLUTE CONSTANT| = XY.
    Thank you.

    ReplyDelete
  4. Example of calibration and turning of axis for Figure 6:
    Turn the graph 90deg anti-clock wise. Calibrate the x-axis from 0 to 52.5 where the
    graph will become out of range after 28. Start with zero left and 52.5 at the right end.
    Keep the same graphical lay-out as these will be converted later.
    Complete a mirror image of the right part of the graph at the following point:
    T at 121.692 K and heat capacity cp = J/(mol • K) at 23.437 by re-scaling both axis to a maximum of 15. In this new range you have kept the graphs in position and they are as follows:
    Line y = mx + c where m = -1.7 and c = 13.28
    Hyperbola xy = 1.7
    These intercept at the following point:
    x^2 - 7.8x + 1 = 0
    Re-scaling the root 13.08 to the initial scale is (13.08/15) x 1400 = 1220.8 K
    where red and blue collide. This method is very useful if you don’t have Mathematica 5.2-6.0, Maple 10, or Matlab 7.0 at home utilizing multiprocessor calculations or aren’t able to use a Runge-Kutta method of 8-9th order and the numerical methods of lines or are not a graduate. The numerical error estimate didn't exceed 0.015% in addition to the existing 0.01% at the high point of interception.
    We now have confirmation of figure 6 and it doesn’t help us at all.

    ReplyDelete
  5. http://arxiv.org/pdf/1209.0648.pdf

    The power function model applied to figure 6 yields a different outcome:
    heat capacity cp = 35 J/(mol • K)
    thermal conductivity = 46 W/(m • K)
    temperature = 1526 K
    where graphs collide. Figure 5 and 6 are both out of range for those values.
    Playing with models is certainly not a good thing when it comes to accuracy, but compare this to the green graph of figure 2.

    ReplyDelete
  6. Werner can you translate your analysis for readers.

    What are the implications of your calculations?

    What do they tell us about Fukushima?

    Thank you

    ReplyDelete
  7. For plutonium, with a melting point of 912.5 K and a constant-volume molar heat capacity of 35.5 J/(mol·K) at 298 K, we will be out of range for Figure 6 as early as 289 K, with significant volume changes between 300 K and 1000 K also drastically changing approximations.

    ReplyDelete
  8. Majia, Mr. Arnie Gunderson was referring to temperatures above boiling points.
    The characteristic of the graphs changes several times during the meltdown process,
    and many constants have been derived for low temperatures which don't apply at high temperatures. It is not rare to see speculations being underestimated in all calculations, from cancer fatalities to Bq to Sv. Although I am uneducated myself and
    in no position of authority to criticize the work of others, much more energy could have been released than what the calculations can provide. Some calculations have possibly given
    a 'divide by zero' error in the past, causing them to devise a workaround instead of believing it is impossible according to the formula. The methods described are not transportable anywhere else, but are easy and applicable to show a continuation of a pattern up to the melting point. Constants are intertwined in an endless bucket of formulas. Change one constant and you have to change them all. One small estimation error per atom can result in millions dying.

    ReplyDelete
  9. Thank you very much for explaining Werner.

    The theorists' abstract numerical representations of death-causing elements rationalize nuclear and allow the psychological detachment necessary for the perpetuation of this insanity.

    Thank you for unpacking the fundamental uncertainty at the heart of these detached mathematical formulations.

    Max Weber was correct about the potential for evil in bureaucratic rationalization.

    ReplyDelete
  10. http://neucleardisaster.blogspot.com/

    ReplyDelete

Note: Only a member of this blog may post a comment.