Cafe Rad Lab Forum
EPA RADNET Discussion
#1
Fukushima Update: "Your Radiation This Week, Sept 19 to Sept 26, 2015"  http://www.coyoteprime-runningcauseicant...is_26.html

Data from US EPA RadNet, with deeper dialog and explanation of radiation contamination, measurement, history, etc. References included.
Pia
just pm me if needed.
 
#2
(09-26-2015, 01:06 PM)piajensen Wrote: Fukushima Update: "Your Radiation This Week, Sept 19 to Sept 26, 2015"  http://www.coyoteprime-runningcauseicant...is_26.html

Data from US EPA RadNet, with deeper dialog and explanation of radiation contamination, measurement, history, etc. References included.

Very skeptical of these reports

From the link - "These are the recorded Radiation Highs that affected people this week around the United States and in your neighborhood."

To me, that means purposefully selecting biased results. What I mean is that over the course of normal operation, the operator should not be surprised to see random high and low readings. These monitors run continuously, but there is no way we can say a radiological event is happening just because we get 1 high reading out of 100, 500, or 1,000 readings that goes against the average. What we want to do is distinguish between real readings and erratic readings. See the image below (I did not generate this information, but it demonstrates my point).

[Image: chart of detector readings over time, showing erratic spikes]

In the image above there are 4 erratic spikes. The first and third spikes are followed by an erratically low reading (which would make me even more skeptical). The second and fourth spikes would still be excluded when finding an average, but they would require more analysis to prove they were a valid response.

When we collect data for scientific purposes, the first thing we want to do is factor out these random readings unless there is a substantive reason for including them. If we count the same thing over and over, say 12 times, it would not be out of line to exclude the highest (or two highest) and the lowest (or two lowest) readings to get a better understanding of the rest.
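A simple way to do that kind of exclusion is a trimmed average. Here is a minimal Python sketch; the counts are made-up numbers, not data from any real monitor:

```python
def trimmed_average(readings, n_trim=2):
    """Average count readings after dropping the n_trim highest
    and n_trim lowest values."""
    if len(readings) <= 2 * n_trim:
        raise ValueError("not enough readings to trim")
    kept = sorted(readings)[n_trim:-n_trim]
    return sum(kept) / len(kept)

# Twelve repeated counts, with one erratic spike and one erratic dip:
counts = [100, 98, 103, 101, 250, 99, 102, 97, 30, 100, 104, 96]
plain = sum(counts) / len(counts)  # pulled around by the spike and dip
robust = trimmed_average(counts)   # spike and dip excluded
```

The plain mean here lands above every "normal" reading because of the single spike; the trimmed mean recovers the value the repeated counts cluster around.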

So to bring this back full circle, I don't think it's that useful to cherry-pick only the high readings and report those as absolute values.  

Lastly, the comment from the link, "Note that these numbers will continue to climb, since radiation is cumulative, and half-lives of some types are millions of years.", is not correct. Yes, the half-lives of isotopes can extend for hundreds of thousands of years, but even if we accept these "high value readings" as VALID readings and not some kind of error (which I don't), that does not allow us to conclude these "high value readings" will continue to climb uniformly and never go back down. These materials have a propensity to travel and migrate, and unless there are programs in place to prevent this transport, they will actually spread out along the most available transportation path.

I will try to use a basic illustration to demonstrate what I am saying. Imagine a monitor set up outdoors, taking continuous readings. Now imagine that I want to manipulate its readings, so I take a baggie of low-level contaminated dirt and dump it on the monitor. Sure, this may cause a spike in readings, but the longer the material sits on top of the monitor, the more it is affected by natural environmental processes: rain and moisture cause it to migrate, winds blow it around, someone may even come along, see the dirt on top of the monitor, and brush it off into the environment. It doesn't take very long for this concentrated bag of low-level contaminated dirt to spread around in the environment, and the more it spreads, the lower the levels measured by the detector. So in fact, the measurements would not continue to climb; they would peak and gradually decline. What would have happened in this hypothetical situation is that additional radioactive material was introduced into the environment, which in high enough concentrations could affect the natural background levels.

There are two ways to communicate what the author is trying to say: a correct way and an incorrect way.

The incorrect way: "Note that these numbers will continue to climb, since radiation is cumulative, and half-lives of some types are millions of years." Again, this implies that these erratic high readings will be even higher if the same location is visited in a month or a year, which can't be proven and shouldn't be argued.

The correct way: "Global background levels of radioactivity will increase over time as more and more medical, industrial, and military applications of nuclear energy release radioactive materials into the environment through standard operations or accidents."

Now I am NOT saying that we shouldn't monitor all available sources of radiation readings that we can verify. But what I'm saying is that random reported numbers mean nothing if not given in context.

If I were this author, instead of cherry-picking the highest values I could find, I would create running averages by city on a daily, weekly, and monthly basis (which would help factor out environmental processes that can change background levels), then graph those monitors so I could track trends over time. Then I could say: CITYA averaged XXX cpm for the first three months of the year, but in April and May this city suddenly started averaging XXXX cpm; what happened?
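The running-average idea could be sketched like this in Python. Everything here is illustrative: the daily cpm figures are invented, and the 7-day window is just one reasonable choice:

```python
def running_average(values, window):
    """Trailing moving average; None until the window has filled."""
    out = []
    for i in range(len(values)):
        if i + 1 < window:
            out.append(None)
        else:
            out.append(sum(values[i + 1 - window:i + 1]) / window)
    return out

# Hypothetical daily-average cpm for one city over two weeks:
daily_cpm = [95, 102, 98, 101, 97, 99, 103,       # a quiet week
             140, 150, 160, 155, 148, 152, 158]   # a sustained rise
weekly = running_average(daily_cpm, 7)
# A single one-day spike barely moves a 7-day average, but the
# sustained shift in the second week drags the average up with it.
```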
"All models are flawed, some are useful."
George E. P. Box
 
#3
Pertinent analysis. What does this say about US EPA reporting and capacity of end users to correctly interpret "current" data and trends?
Pia
just pm me if needed.
 
#4
I don't yet have the equipment to measure radiation for myself and look to Radnet or Netc.  Radnet often has gaps or gets turned off and now they're redoing their website so nothing works over there.  Are the graphs showing the background variation, with spikes being readings above background?  Graphs at Radnet show gamma in 10 energy ranges.  Can I determine which radionuclide from the energy range?    Could you shed some light on Radnet's graphs?

Radnet shows me a gross beta reading in cpm. Three years ago Radnet had readings of 30 cpm beta for my area. A year and a half ago the readings ranged from 100 to 300 cpm. Veterans Today has been posting current beta cpm readings ranging from 300 to 2,000. "This is per the USA EPA Radnet radiation data - CPM=counts per minute: 50 CPM and over is abnormal - 100 CPM alert level - 300 CPM evacuation or hazmat." Why are beta readings increasing? Should we begin evacuations of North America?

I see many attempts by the nuclear industry to change permissible radiation exposure levels.  Is this because levels have been increasing and we are now exposed to more ionizing radiation than current levels allow?
"The map is not the territory that it is a map of ... the word is not the thing being referred to."
 
#5
(09-26-2015, 03:42 PM)piajensen Wrote: Pertinent analysis. What does this say about US EPA reporting and capacity of end users to correctly interpret "current" data and trends?

RADNET is not the system it is sold to the public as; even government officials can admit that. It is not designed for a member of the public to log on and get meaningful information that is easy to understand.

http://enformable.com/2011/09/7157/

The EPA doesn't manage the system like a critical national-security asset. On March 11, 2011, at the time of the Japan nuclear incident, 25 of the 124 installed RadNet monitors (20 percent) were out of service, and had been for an average of 130 days.

[Image: Analysis-of-25-Out-of-Service-Monitors-a...25x499.png]
The individual RADNET stations are manned by unpaid volunteers, who are supposed to perform tasks like changing the filters every two weeks, but that doesn't always happen. Seven of 12 monitors were broken for an extended period and thus did not have filter changes or submit data for 21 to 339 days. Over the 1-year time span, 42 percent of required filter changes did not occur because of broken monitors or volunteer issues.

An Inspector General report (see below) says that the EPA's RadNet program will remain vulnerable until it is managed with the urgency and priority that the Agency itself says the program holds for its mission, a priority also reflected in the National Response Framework for Nuclear/Radiological Incidents.

http://www.epa.gov/oig/reports/2012/2012...P-0417.pdf

Now it should be remembered "Just because you can't do EVERYTHING doesn't mean you can't do ANYTHING."  

Just because RADNET is incapable of doing what it is marketed as being able to do doesn't mean it isn't useful for anything. You just have to work to get some data out of it, and the data may not be the best, but hopefully it points you in the right direction.
"All models are flawed, some are useful."
George E. P. Box
 
#6
I am baffled.

Everybody agrees that radiation can cause health problems and even death with exposure. The EPA is charged with protecting health and the environment.
  • 2013: Hughes Network Systems has received an order from the Environmental Protection Agency (EPA) to provide satellite broadband services for the agency’s RadNet program http://www.satellitetoday.com/telecom/20...to-hughes/
  • 2011: Environmental Dimensions, Inc. (EDI), is actually in charge of maintaining the RadNet system http://www.naturalnews.com/032525_EPA_RadNet.html
  • 2011: Environmental Dimensions, Inc. (EDI) has provided maintenance for EPA’s RadNet monitoring systems under a sole source contract https://pstuph.wordpress.com/2011/04/04/...nt-page-3/ The contract was awarded to what is described as a “Woman-owned 8(a) Small Disadvantaged Business“. The disadvantaged woman in this case is EDI company president Patricia S. Bradshaw, a former Deputy Under Secretary of Defense appointed by George Bush.
Certainly there is no good reason to rely upon volunteers when it appears that the contractors are qualified and have been paid quite handsomely to supply and maintain services.
Pia
just pm me if needed.
 
#7
(09-27-2015, 07:51 AM)Horse Wrote: I don't yet have the equipment to measure radiation for myself and look to Radnet or Netc.  Radnet often has gaps or gets turned off and now they're redoing their website so nothing works over there.  Are the graphs showing the background variation, with spikes being readings above background? 

I'm breaking your questions into a few responses.

Radnet often has gaps or gets turned off -- I would add, or is improperly monitored by volunteers. The data is not really scientifically valid; it is designed more as an early detection system.

Let's talk about background for a second. When you operate a detector, the first and most important thing is to know how your detector behaves. You can only learn this by operating it over time and in different circumstances, so that you know what it looks like during normal operations and when it is exposed to radioactive materials. The background levels are the levels registered during normal operations, averaged over time to give you a better sense of what to expect on an average day; in reality they fluctuate from measurement to measurement, hour to hour, and day to day. I will demonstrate this below.

I don't really care about absolute values, because without context (understanding the detector, normal levels during operations, how the measurement was conducted, what variables need to be taken into consideration, etc.) they really mean nothing. Science must be repeatable by other people who have the same or similar equipment and follow the same procedures. I am more interested in the variations in values (what's the lowest, what's the average, what's the highest) and in understanding why those variations take place (is it a natural process, e.g., the wash-out of radon by rain, or some unconsidered variable?). No two detectors are exactly alike; they all have their own characteristics and operating parameters. I have many 2x2 NaI probes: they all operate best at different high-voltage settings, they all have different background levels in cpm, and they all perform differently (in terms of efficiency).

If someone calls me and says they just measured 500 cpm with their detector, there's not a lot I can do with that one measurement. I need to know: what detector are you using, what do its normal operating readings look like, what happened, did you do anything differently than you normally would, etc.?

Let me give you an example from my actual data about determining the actual background levels of a detector.  I keep detailed notes of each detectors responses during operations, so I can track variations over time.

In this particular example I was working with a Ludlum Model 2000 scaler and an alpha scintillator. This is a desktop unit, meaning it stays in one position in the lab; I want to limit the potential variables that could affect the readings. When using a scaler like this, I am counting gross counts (all the counts) over a specific period of time (1 minute, 1 hour, etc.). I need to understand how many counts come from just operating the detector and how many actually come from the sample I am looking at, so I perform a number of runs with no sample in front of the detector and average out the background. It is important not to change anything during this time that would affect the counts: don't move the machine, don't change the counting time, etc. Do everything exactly the same way you will analyze the sample.

In this particular scenario, I was letting the scaler operate for 60 minutes.

Hour 1 - 106 counts
Hour 2 - 102 counts
Hour 3 - 104 counts
Hour 4 - 110 counts
Hour 5 - 89 counts
Hour 6 - 113 counts
Hour 7 - 89 counts
Hour 8 - 101 counts
Hour 9 - 83 counts
Hour 10 - 95 counts
Hour 11 - 86 counts
Hour 12 - 104 counts
Hour 13 - 113 counts
Hour 14 - 103 counts
Hour 15 - 91 counts

Now a quick average of all of those readings would tell you that this particular set-up could be expected to yield about 99.27 counts in a 60-minute span, or about 1.65 counts per minute.

If I wanted to ensure that my readings weren't being skewed by erratic readings (extremely high or extremely low) I could check to see what the averages are after eliminating the two highest and two lowest readings.  In which case the counts would look like:

Hour 1 - 106 counts
Hour 2 - 102 counts
Hour 3 - 104 counts
Hour 4 - 110 counts
Hour 5 - 89 counts
Hour 6 - 113 counts (excluded: tied highest)
Hour 7 - 89 counts
Hour 8 - 101 counts
Hour 9 - 83 counts (excluded: lowest)
Hour 10 - 95 counts
Hour 11 - 86 counts (excluded: second lowest)
Hour 12 - 104 counts
Hour 13 - 113 counts (excluded: tied highest)
Hour 14 - 103 counts
Hour 15 - 91 counts

Now my averages would be - 99.45 counts per 60 minute run, which still works out to be 1.65 counts per minute.

Based on these tests I would feel like I had a good starting point and could start putting samples through the machine.
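The arithmetic in this example can be reproduced directly in Python, using the fifteen hourly counts listed above:

```python
hourly_counts = [106, 102, 104, 110, 89, 113, 89, 101,
                 83, 95, 86, 104, 113, 103, 91]

# Plain average over all fifteen 60-minute runs:
avg_all = sum(hourly_counts) / len(hourly_counts)   # ~99.27 counts/hour

# Average after dropping the two highest runs (113, 113)
# and the two lowest (83, 86):
trimmed = sorted(hourly_counts)[2:-2]
avg_trimmed = sum(trimmed) / len(trimmed)           # ~99.45 counts/hour

cpm_all = avg_all / 60          # ~1.65 counts per minute
cpm_trimmed = avg_trimmed / 60  # essentially the same rate
```

Dropping the extremes changes the hourly average by less than a quarter of a count here, which is itself a sign that this background was stable.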

If I wanted to, I could also perform 3 or more background runs per day and track my background over days, weeks, months, years, etc.

So yes, you see fluctuations in the EPA data. That is not a bad thing; in fact, it allows you to average the background for a particular detector. What you really want to know is how large the variation is and what it means (if anything).

Personally, I'm most interested in multiplicative changes in the data: is it 2 times background, 5 times background, etc.? I want to know that what I'm looking at is not an anomaly.
"All models are flawed, some are useful."
George E. P. Box
 
#8
(09-27-2015, 09:12 AM)piajensen Wrote: I am baffled.

Everybody agrees that radiation can cause health problems and even death with exposure. The EPA is charged with protecting health and the environment.
  • 2013: Hughes Network Systems has received an order from the Environmental Protection Agency (EPA) to provide satellite broadband services for the agency’s RadNet program http://www.satellitetoday.com/telecom/20...to-hughes/
  • 2011: Environmental Dimensions, Inc. (EDI), is actually in charge of maintaining the RadNet system http://www.naturalnews.com/032525_EPA_RadNet.html
  • 2011 Environmental Dimensions, Inc (EDI) has provided maintenance for EPA’s RadNet monitoring systems under a sole source contract https://pstuph.wordpress.com/2011/04/04/...nt-page-3/ The contract was awarded to what is stated as a “Woman-owned 8(a) Small Disadvantaged Business“.  The disadvantaged woman in this case is EDI company president Patricia S. Bradshaw, former Deputy Under Secretary of Defense appointed by George Bush.
Certainly there is no good reason to rely upon volunteers when it appears that the contractors are qualified and have been paid quite handsomely to supply and maintain services.

The volunteers operate the machines, check on them, swap out filters and send them to labs, etc. The contractors are in charge of getting the units deployed, performing maintenance on them, deciding which labs will analyze which samples, etc. It's not a perfect program, and there are many lapses, in my opinion.
"All models are flawed, some are useful."
George E. P. Box
 
#9
(09-27-2015, 07:51 AM)Horse Wrote: Graphs at Radnet show gamma in 10 energy ranges.  Can I determine which radionuclide from the energy range?    Could you shed some light on Radnet's graphs?

I sure can try.

Each radionuclide emits gamma rays with specific energies (generally measured in keV or MeV), which can be used like a fingerprint to identify it. This data can be collected and displayed as a spectrum showing peaks that, with the right information and training, can be identified as specific isotopes. By studying the radiation energy spectrum, experts can discern which radionuclides are present around a monitor.

RadNet organizes the gamma energy spectrum into ten contiguous ranges, termed Regions of Interest (ROIs). The fixed monitors report total gamma detections (counts) within each ROI. By tracking the long-term changes in each ROI at each monitor, one can roughly determine the increase or decrease of radionuclides (within a certain energy range) at a specific location. It will not tell you the exact energy, and therefore the exact isotope, but you should be able to identify whether counts are coming from a range that includes cesium, iodine, etc.
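A rough Python sketch of that binning idea. To be clear, the boundary values below are illustrative placeholders, not the EPA's actual ROI definitions; the gamma-line energies for I-131 (~364 keV) and Cs-137 (~662 keV) are the well-known values for those isotopes:

```python
# Illustrative energy regions in keV; these are placeholder
# boundaries, NOT the EPA's real ROI definitions.
ROIS = [
    ("ROI-1", 0, 200),
    ("ROI-2", 200, 400),
    ("ROI-3", 400, 600),
    ("ROI-4", 600, 800),
    ("ROI-5", 800, 3000),
]

def roi_for_energy(kev):
    """Return the name of the region a gamma energy falls into,
    or None if it is outside every region."""
    for name, lo, hi in ROIS:
        if lo <= kev < hi:
            return name
    return None

# Well-known gamma lines: I-131 at ~364 keV, Cs-137 at ~662 keV.
# Each lands in a region, but a count in that region could come from
# any isotope that emits within the same energy window.
```

With only a handful of broad regions, ROI data can flag that something in, say, the 600-800 keV window is elevated, but it cannot distinguish Cs-137 from any other emitter in that window.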

Of course the EPA has removed this page from their website. I was able to pull up an old cached version, but the images it once hosted are now gone, so you have to settle for this screenshot, which is not very detailed (blame the EPA, not me!).

[Image: screenshot of the RadNet graph page: hourly spectra (left), ten ROI line graphs (middle), gross gamma counts (right)]

In this image, on the far left, you can see the spectra generated by the detector, by hour. (This would be the most interesting data to me, but of course the EPA doesn't release it publicly. If they did, anyone who knew how to interpret spectra could tell you exactly which isotopes were in the area at that particular time.)

In the middle image, the EPA has separated the spectra into the 10 regions of interest and displayed them as line graphs, so you can see whether spikes are occurring in one or more ranges. (The EPA makes this available, but it is really only useful when there has been an event involving a release of radiation, and it does not allow you to determine which exact isotopes are present, only those within a range of energies.)

On the right we see essentially the gross (total) counts regardless of energy range: the overall gamma background. (The EPA makes this available.)
"All models are flawed, some are useful."
George E. P. Box
 
#10
I have updated the title of this thread to EPA RADNET Discussion
"All models are flawed, some are useful."
George E. P. Box
 
#11
H;

To better answer your beta question, can you give me the closest radnet sampling location to you? Or even better the one with the beta flux?
"All models are flawed, some are useful."
George E. P. Box
 
#12
(09-27-2015, 09:22 AM)LWH Wrote: I'm breaking your questions into a few responses.

Radnet often has gaps or gets turned off -- I would add or is improperly monitored by volunteers.  The data is not really valid scientifically, it is more designed to be an early detection system.  

Might have given us early warning if it had not been shut off after the Daiichi meltdowns.


(09-27-2015, 09:22 AM)LWH Wrote: Let's talk about background for a second...

If I wanted to, I could also perform 3 or more background runs per day and track my background over days, weeks, months, years, etc.

So yes you see fluctuations in the EPA data, that is not a bad thing, actually it allows you to average the background for a particular detector, what you really want to know is how much is the variation and what does it mean (if anything)?   

Personally, I'm most interested in logarithmic changes in data.  Is it 2 times background, 5 times background, etc.  I want to know that what I'm looking at is not an anomaly.

I can apply this analysis to the sparks counts I've done from the recordings.  I've seen distinct changes in activity rates over time.  Taking counts at the same time each night would help eliminate other factors.  


(09-27-2015, 10:06 AM)LWH Wrote: H;

To better answer your beta question, can you give me the closest radnet sampling location to you?  Or even better the one with the beta flux?

Denver would be the closest.  Altitude and natural radioactivity of mountains were the reason for our 30's readings when others were reading in the 10's.
"The map is not the territory that it is a map of ... the word is not the thing being referred to."
 
#13
Ok, give me a couple hours. I have some things to complete, and then I will graph the beta data from the Denver station so we can discuss.
"All models are flawed, some are useful."
George E. P. Box
 
#14
Downloading data from RADNET is a terrible struggle with an archaic system. One can only download 16 days' worth of data at a time, which doesn't make it easy to use the data as a tool. Plus, the download function is buggy and prone to server errors. I don't know who designed this system... but c'mon!
"All models are flawed, some are useful."
George E. P. Box
 
#15
Intrepid investigator has problems getting data from Radnet. System designed to stymie research. Tedious work yields unreliable results, helps keep radiation invisible.
"The map is not the territory that it is a map of ... the word is not the thing being referred to."
 
#16
(09-27-2015, 07:15 PM)Horse Wrote: Intrepid investigator has problems getting data from Radnet.  System designed to stymie research.  Tedious work yields unreliable results, helps keep radiation invisible.

Unfortunately RADNET (like the Canberra BABYSCAN Whole Body Counter) is more of a political tool than a scientific one.
"All models are flawed, some are useful."
George E. P. Box
 
#17
(09-27-2015, 07:51 AM)Horse Wrote: Radnet shows me a gross beta reading in cpm.  Three years ago Radnet had readings of 30 cpm Beta for my area.  A year and a half ago the readings ranged from 100 to 300cpm.  Veterans Today has been posting current Beta cpm readings ranging from 300 to 2,000.  "This is per the USA EPA Radnet radiation data - CPM=counts per minute:  50 CPM and over is abnormal - 100 CPM alert level - 300 CPM evacuation or hazmat."  Why are Beta readings increasing?  Should we begin evacuations of North America?  

Ok so not to keep picking on RADNET, but here is another demonstration of why it is just not a very good scientific tool.

Forget the fact that you can't easily access the data you need; the times when they do or don't report values also make it difficult to analyze.

My thinking was that there would be a brief latency period after the Fukushima Daiichi disaster before beta levels could be expected to increase in the United States, so I selected a window in April (3-6 weeks after the start of the release) that would allow time for materials to make the trans-Pacific voyage. Unfortunately, as I mentioned, you can only export 400 events at a time from RADNET, and with 24 hourly readings in a day, that gave me 04/01 to 04/18 but stopped in the middle of the day on the 18th. So I cut it back to the 17th, so we would have full 24-hour runs.

The other issue was that in 2009, 2014, and 2015 there were no beta readings reported at all.

In 2008 the detector was only active from 4/1 - 4/10.
In 2012, the detector was only active from 4/12-4/17.

Next, it wasn't very efficient to try to map every hourly reading, so I quickly (and roughly) averaged them per day (the sum of all readings available for that day divided by the number of readings taken that day).
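That per-day averaging amounts to grouping hourly readings by date. A minimal Python sketch, with invented dates and cpm values for illustration:

```python
from collections import defaultdict

def daily_averages(hourly):
    """hourly: list of (date_string, cpm) tuples, one per reading.
    Returns {date: average cpm}, using however many readings each
    day actually has (gaps simply shrink the divisor)."""
    by_day = defaultdict(list)
    for day, cpm in hourly:
        by_day[day].append(cpm)
    return {day: sum(v) / len(v) for day, v in by_day.items()}

# Invented readings: one full stretch and one day with gaps.
readings = [("2011-04-01", 70), ("2011-04-01", 80), ("2011-04-01", 90),
            ("2011-04-02", 100), ("2011-04-02", 110)]
averages = daily_averages(readings)  # 80.0 cpm and 105.0 cpm
```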


The highest level detected in a day between all of the data collected was 4/3/15 with 173.6 cpm
The lowest level detected in a day between all of the data collected was in the same year 4/6/15 with 22.79 cpm

The average of all of the readings taken in this period (2008-2013) was 76.19 cpm. So I would expect to see the occasional reading at 2x that, and maybe even a random 3x. If a reading got to 4x that (over 300 counts), I would have follow-up questions to ask (like: what other variables aren't we considering?). If it got over 1,000 counts, I would first want to confirm the reading (have the detector checked to make sure it is responding properly), then work quickly to investigate why.
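That tiered response can be written down as a simple rule of thumb. The tier labels below are my own wording of the reasoning above, using the 76.19 cpm baseline from this dataset:

```python
BASELINE = 76.19  # average cpm over the 2008-2013 April data above

def triage(cpm, baseline=BASELINE):
    """Classify a gross beta reading by its multiple of the local
    baseline, per the tiers described in the post (labels are mine)."""
    if cpm > 1000:
        return "verify the detector, then investigate urgently"
    if cpm > 4 * baseline:  # roughly the 300-count mark here
        return "ask what other variables are in play"
    return "within the range of occasional 2x-3x excursions"
```

Note that `triage(173.6)`, the highest daily average in this dataset, stays in the first tier, matching the conclusion that even that peak did not warrant alarm.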


Here's how the data looked:
[Image: graph of the daily beta averages]

Lastly, I also averaged all of the days in each year to get a rough picture of whether things really are increasing so dramatically.

Year - Average Beta (cpm)
2008 -  122.57
2009 - ???
2010 -  65.52
2011 -  78.11
2012 -  63.08
2013 -  62.28
2014 - ???
2015 - ???
 
I can't recommend evacuations based on that data, but in fairness, it isn't very good data either.
"All models are flawed, some are useful."
George E. P. Box
 
#18
From the averages given, it looks like everything got better after a 2011 blip that was far lower than a previous high. That doesn't seem realistic: a nuclear disaster, and beta is barely raised and drops off rapidly. Anyone can quote part of a RadNet report and say it's going up or down without a thorough analysis. Appreciate your insight. RadNet isn't giving me any useful information; political motivations make publicly available data suspect, or we're using the wrong tool. Would gamma readings tell us more about nuclear releases? If the instruments aren't detecting any significant changes, are they sensitive enough for that task? I'm still sheltering in place till I know where to evacuate to.
"The map is not the territory that it is a map of ... the word is not the thing being referred to."
 
#19
Corvallis Oregon, 5 Years of Compiled Radiation Data, Nicely Plotted

http://www.fromthetrenchesworldreport.co...ent-962428
"The map is not the territory that it is a map of ... the word is not the thing being referred to."
 
#20
Some data from EPA.

Interesting note: Anchorage scored the highest beta reading ever for the Pacific states for the years 2011-2015.
1887 CPM in 2012 from RadNet.
The average that year was 68 CPM.
The five year average was 87 CPM.
Here are each year's averages:
2011 - 46 CPM
2012 - 68 CPM
2013 - 85 CPM
2014 - 71  CPM
2015 - 164 CPM
2016 - monitor data turned off
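Those yearly averages reproduce the five-year figure quoted above:

```python
# Yearly average beta readings (cpm) from the post above;
# 2016 is omitted because the monitor data was turned off.
yearly_avg_cpm = {2011: 46, 2012: 68, 2013: 85, 2014: 71, 2015: 164}

five_year = sum(yearly_avg_cpm.values()) / len(yearly_avg_cpm)
# (46 + 68 + 85 + 71 + 164) / 5 = 86.8, which rounds to the
# quoted five-year average of 87 cpm.
```

The 1887 cpm spike in 2012 was nearly 28 times that year's 68 cpm average, exactly the kind of multiple-of-background anomaly discussed earlier in the thread.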

Graph resized:
[Image: resized graph of the Anchorage beta data]
 