• Thank you for visiting the Cafe Rad Lab Forum
  • We present & discuss radiation health, science & news to keep you informed about vital nuke information.


EPA RADNET Discussion
#21
Thank you for the contribution, Matt. It's really unfortunate that the EPA can't keep their monitoring system fully operational. I consider their lack of adequate response a dereliction of duty. A suggestion on images - resize them to 700 pixels wide and the full image will appear. Providing the source URL for the data is really helpful too, so folks can go see such things as high radiation counts.

I just did a query for gross beta CPM for 1/1/2012 to 12/31/2012 and the system spit out 2008 data.
https://cdxnode64.epa.gov/radnet-public/query.do (query tech fail)
   
Pia
just pm me if needed.
 
Reply
#22
(10-23-2016, 09:55 AM)piajensen Wrote: Thank you for the contribution, Matt. It's really unfortunate that EPA can't keep their monitoring system fully operational. I consider their lack of adequate response a dereliction of duty. A suggestion on images - resize them 700x (pixels) and the full image will appear. Providing the source url for data is really helpful also so folks can go see such things as high radiation counts.

I just did a query for gross beta CPM for 1/1/2012 to 12/31/2012 and the system spit out 2008 data.
https://cdxnode64.epa.gov/radnet-public/query.do (query tech fail)

One never knows when the RadNet site will send you to another time, or place for that matter, when using the CDX site.
They just recently increased the download speed, going from a super slow 3-5 minutes per download to a blazing 5 seconds! You can pull a full year off in 20 minutes now...so get to it!

Right now, you can only access Seattle and Honolulu back to 2010...all the rest on the west coast only go back to 2011.
I see some of you have been looking at data and trying to compare it on a download-by-download basis, going back as many years as you can. This doesn't work...trust me.

Radiation typically produces a bowl-shaped graph, with the summer being the lowest and the winter the highest... But it shifts around quite a bit, especially after 311, and you ALWAYS get blackouts when the readings reach a certain set point.

They actually use two high set points.
 One is a temp hold, let's say at 500, which they will show you if the readings don't hit a higher "no show" set point, let's say at 600. Data above that 600 point will never be seen by the public. When the radiation levels drop back below the temp-hold set point at 500, they then release the data between the two set points. The release of this held data is done within a few days AFTER the readings get below the 500 point. These blackouts are typically more than just an hour long.
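The hold-and-release scheme described above can be sketched in a few lines of Python. The 500/600 thresholds are the post's illustrative numbers, not known EPA values, and this toy version backfills the held band at the first sub-500 reading rather than "a few days" later:

```python
TEMP_HOLD = 500   # "temp hold" set point from the example above
NO_SHOW = 600     # "no show" set point from the example above

def publish(stream):
    """Return the feed as a viewer would see it; None = blackout.

    Hours double as list indices in this toy example.
    """
    held, out = [], []
    for hour, value in stream:
        if value > NO_SHOW:
            out.append((hour, None))        # never released
        elif value > TEMP_HOLD:
            held.append((hour, value))      # blacked out for now
            out.append((hour, None))
        else:
            out.append((hour, value))
            for h, v in held:               # levels back under 500:
                out[h] = (h, v)             # backfill the held band
            held = []
    return out

readings = [(0, 480), (1, 550), (2, 650), (3, 520), (4, 470), (5, 450)]
print(publish(readings))
# the 650 reading stays None forever; 550 and 520 appear only
# after the feed drops back below 500
```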

The hour-long ones, and there are many of those, are spikes, and should NOT be dismissed too quickly.
The Alaskan sites have quite a few of those, spaced really close together, for weeks on end. Strange stuff!

A word on Beta.
 It is pretty much discontinued on the west coast now.
They claim cell tower interference. I think they have their scales in an uncalibrated state, judging by the zero readings that show up.
Beta is tough to control, as you have that bottom line that doesn't move. Gamma floats and is easy to manipulate.
 Big jumps up and down are super common on most sites. Makes you wonder...

Basically what I have seen is a month-long hit in the summer of 2011, a pause, then another, bigger hit in early 2012.
 With so many blackouts, it really makes me wonder if we haven't had a bunch of melts since 311.
 
Reply
#23
Thanks for the RadNet tech history lesson, Matt. One thing we can count on - there will be censorship.
 
Reply
#24
(12-11-2016, 12:28 AM)matt Wrote: They actually use two high set points.
One is a temp hold, lets say this is at 500, which they will show you if the readings don't hit a higher "no show" set point, lets say at 600. Data above that 600 point will never be seen by the public, and when the radiation levels drop down to below the temp hold set point at 500, they will then release the data between the two set points. This release of this held data is done within a few days AFTER the readings get below the 500 point. These blackouts are typically more than just an hour long.

Using two high set points sounds like a way they could hide data.  Wouldn't the set points for an individual station show in the data?  Can the set points be determined?  From just looking once in a while at Denver readings over the last few years, it seems the average has gone up.  Highs in the 300s are now highs in the 600s.  

(12-11-2016, 12:28 AM)matt Wrote: A word on Beta.
It is pretty much discontinued on the west coast now.
They claim cell tower interference. I think they have their scales in an uncalibrated state, judging by the zero readings that show up.
Beta is tough to control as you have that bottom line that doesnt move. Gamma floats and is easy to manipulate.
Big jumps up and down are super common on most sites. Makes you wonder...

Basically what I have seen is a month long hit in the summer of 2011, a pause, another bigger hit in early 2012.  With so many blackouts, it really makes me wonder if we haven't had a bunch of melts since 311.

With all the beta testing being discontinued, as it was in Denver early on, I wonder what particles are floating around.  Storms in Seattle usually take two to three days to reach Denver, so we get the Fuku stuff with a dash of Hanford vapor.  With the EPA RadNet in such a sorry state, we won't get early warning of any major radiation releases that might occur.  Russia can't afford another meltdown to become public, and China wouldn't say a word if something happened.  RadNet is just political propaganda to reassure the public, not a useful scientific tool to measure radiation contamination.
"The map is not the territory that it is a map of ... the word is not the thing being referred to."
 
Reply
#25
Even though EPA's RadNet can't be trusted, I do find their charts interesting when comparing locations. Starting with San Juan Puerto Rico...
               
 
Reply
#26
(12-11-2016, 10:18 AM)Horse Wrote:
(12-11-2016, 12:28 AM)matt Wrote: They actually use two high set points.
One is a temp hold, lets say this is at 500, which they will show you if the readings don't hit a higher "no show" set point, lets say at 600. Data above that 600 point will never be seen by the public, and when the radiation levels drop down to below the temp hold set point at 500, they will then release the data between the two set points. This release of this held data is done within a few days AFTER the readings get below the 500 point. These blackouts are typically more than just an hour long.

Using two high set points sounds like a way they could hide data.  Wouldn't the set points for an individual station show in the data?  Can the set points be determined?  From just looking once in a while at Denver readings over the last few years, it seems the average has gone up.  Highs in the 300s are now highs in the 600s.  

The two high set points do just that...hide data that is too high, and yes...you can see the set points, but I have noticed they change from year to year and are getting slowly higher each year. Sometimes they miss cutting out a high point, so the only way to really see it is to download data on a daily basis. This flushes it out. With the quick downloads today, you can pull off a year in about 20 minutes or so. It is fast!
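One way to "flush out" readings that later disappear, per the daily-download approach above, is to diff two snapshots of the same station and period. A rough sketch; the column names (`SAMPLE_DT`, `GROSS_BETA_CPM`) are guesses at the export layout, not confirmed RadNet headers:

```python
import csv

def load(path):
    """Map timestamp -> gross beta value for one downloaded CSV."""
    with open(path, newline="") as f:
        return {row["SAMPLE_DT"]: row["GROSS_BETA_CPM"]
                for row in csv.DictReader(f)}

def vanished(early_path, late_path):
    """Timestamps with a value in the early snapshot but blank/absent later."""
    early, late = load(early_path), load(late_path)
    return sorted(ts for ts, v in early.items()
                  if v not in (None, "") and late.get(ts) in (None, ""))
```

Run it on yesterday's download versus today's and any hour that was cut out after the fact shows up in the list.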

(12-11-2016, 12:28 AM)matt Wrote: A word on Beta.
It is pretty much discontinued on the west coast now.
They claim cell tower interference. I think they have their scales in an uncalibrated state, judging by the zero readings that show up.
Beta is tough to control as you have that bottom line that doesnt move. Gamma floats and is easy to manipulate.
Big jumps up and down are super common on most sites. Makes you wonder...

Basically what I have seen is a month long hit in the summer of 2011, a pause, another bigger hit in early 2012.  With so many blackouts, it really makes me wonder if we haven't had a bunch of melts since 311.

With all the beta testing being discontinued, as it was in Denver early on, I wonder what particles are floating around.  Storms in Seattle usually take two - three days to reach Denver so we get the Fuku stuff with a dash of Hanford vapor.  With the EPA radnet in such sorry state we won't get early warning of any major radiation releases that might occur.  Russia can't afford another meltdown to become public and China wouldn't say a word if something happened.  Radnet is just political propaganda to reassure the public; not a useful scientific tool to measure radiation contamination.
RadNet is only an "early warning station" in that, when you don't see any data, you know it is high. Quite the Orwellian way to do it! Beta was nice, because it was always the first type of radiation to be detected during an incident. Very sensitive IF they had the scale and settings correct and calibrated. But they messed with all that in the end and the graphs came out looking like a bad joke.

   

(12-11-2016, 10:47 AM)piajensen Wrote: Even though EPA's RadNet can't be trusted, I do find their charts interesting when comparing locations. Starting with San Juan Puerto Rico...

I like using a larger scale...like this...Anaheim CA 2011.
 
Reply
#27
(12-16-2016, 03:13 PM)matt Wrote:
(12-11-2016, 10:18 AM)Horse Wrote: With all the beta testing being discontinued, as it was in Denver early on, I wonder what particles are floating around.  Storms in Seattle usually take two - three days to reach Denver so we get the Fuku stuff with a dash of Hanford vapor.  With the EPA radnet in such sorry state we won't get early warning of any major radiation releases that might occur.  Russia can't afford another meltdown to become public and China wouldn't say a word if something happened.  Radnet is just political propaganda to reassure the public; not a useful scientific tool to measure radiation contamination.
RadNet is only an "early warning station" in that, when you don't see any data, you know it is high. Quite the Orwellian way to do it! Beta was nice, because it was always the first type of radiation to be detected during an incident. Very sensitive IF they had the scale and settings correct and calibrated. But they messed with all that in the end and the graphs came out looking like a bad joke.

The less beta testing being done, the more I worry.  Why does the EPA bother maintaining an early-warning system, only to shut it off when the Fuku plumes arrived?  The EPA failed a basic test when Fuku blew: a known radiological plume was headed this way, and there is no clear indication in the data that anything occurred.  RadNet was supposed to give warning of atomic bombs, but it can't give us any warning if it doesn't work.  Then again, if we did have notice of a plume, I could take my iodine the day before the alert and be protected.  Imagine the clamor for potassium iodide if a dangerous plume were detected; and no supply and distribution system has been set up.  Nuclear war and nuclear power plant accidents produce the same kind of radioactive hazard.  The thoroughly gutted EPA is getting leadership that wants to do away with it completely.  Will RadNet be gone, or made operational?
"The map is not the territory that it is a map of ... the word is not the thing being referred to."
 
Reply
#28
Ya know... I'm wondering if, since the rad stations are manned by "volunteers," some entrepreneurial, well-heeled sort couldn't take on the job of maintaining the system. 

It'd be really cool if, say, someone or a consortium out of Silicon Valley pooled their talent and resources to keep US residents informed.
 
Reply
#29
deleted
the iron fistee

"new posts you make must be approved by a moderator before becoming visible."

my only interest is in removing them now Wink




 
Reply
#30
That's intense. Thank you for bringing us this data visual. Look forward to seeing more from you. Thanks for joining in.
 
Reply
#31
deleted
Reply
#32
Thanks for the info and the kudos, Rad. And, great that you take the initiative to ferret out the truth instead of blindly accepting what people in nuke industry say. Too many trust the "authorities."

Indeed, the US (and other countries, though not a lot of info on others' critical nuclear issues is published) is having a terrible time safely managing munitions waste. You illustrate one of the very basic points about nuclear radiation - there are no materials yet known to man that can fully contain radioactive emitters.

Even Finland admits that their deep repository may not be sufficient, even though they put a lot of research and effort into design and infrastructure.

The whole space exploration concept is a huge distraction from the truth - corporate space mining. We seriously need to put those minds on tasks that help us make it into the next few hundred years, if we actually care about humanity. My dad had NASA as a client when he was one of many vice presidents at Batten, Barton, Durstine & Osborn (BBDO) in Los Angeles. He managed their Apollo 11 marketing campaign. I would love to have seen his files.
 
Reply
#33
deleted
Reply
#34
(12-23-2016, 11:36 AM)rad radio Wrote: Ok let's see if I can place the chart here (I simply overlaid all the Beta charts on top one another, producing a composite chart for the whole country)



the interesting thing about these charts is that, with the whole series updated every 4 months, it was this time period when Beta charts started going down en masse:
Code:
03-02-2011 : 104 active - 20 dead


- - the composite chart was for this time period - -

12-01-2012 : 65 active - 59 dead

08-08-2013 : 62 active - 62 dead

08-21-2014 : 48 active - 76 dead

08-08-2015 : 37 active - 87 dead

02-20-2016 : 32 active - 92 dead

(as you can see the dead ones tripled after the above chart)

Also, there are no zero readings in the 03-02-2011 set on any of the charts


so, right as both the median and the background levels were solidly rising, that's when they accelerated pulling the charts, and they started by pulling the highest-reading ones first - just as Horse suspected.  And we started getting zero readings, too (which is impossible - it indicates error or tampering)

lots of errors or tampering

You have to remember that RadNet's graphing system ALLOWS them to re-set the scales. The zero values show up BECAUSE they re-set the scale to a lower value during times of high beta emissions. So a reading of, say, 200, would show up as 100, and a reading of 100 would show up as zero. Since they can't show us negative values, anything under 100 would then show up as zero. That explains why you get a string of zeros in a row...some positive numbers...and a string more. You can clearly SEE this IF you download the data into an Excel spreadsheet and graph it out.
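The offset-and-clamp effect described above is easy to illustrate. The 100 CPM offset is the post's example figure, not a known RadNet setting:

```python
OFFSET = 100  # hypothetical downward scale shift from the example above

def displayed(raw_cpm):
    """Shift every reading down by OFFSET and clamp at zero,
    as a re-set display scale with no negative values would."""
    return [max(0, v - OFFSET) for v in raw_cpm]

raw = [100, 90, 80, 150, 200, 95, 70]
print(displayed(raw))   # -> [0, 0, 0, 50, 100, 0, 0]
```

Everything at or under the offset collapses into that telltale string of zeros, with real-looking positive numbers in between.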

I forward these findings not on any one or just a few charts, but all of them all the time, because I have all of them all the time - right up to the time they pulled them ALL.

Does anyone have the actual complete dataset?  And does anyone want to see it for themselves?

7,110 Beta charts, sampled 61 times from 03-02-2011 to 09-02-2016 (393 MB)


Horse
(12-11-2016, 12:28 AM)matt Wrote: They actually use two high set points.
One is a temp hold, lets say this is at 500, which they will show you if the readings don't hit a higher "no show" set point, lets say at 600. Data above that 600 point will never be seen by the public, and when the radiation levels drop down to below the temp hold set point at 500, they will then release the data between the two set points. This release of this held data is done within a few days AFTER the readings get below the 500 point. These blackouts are typically more than just an hour long.

Using two high set points sounds like a way they could hide data.  Wouldn't the set points for an individual station show in the data?  Can the set points be determined?  From just looking once in a while at Denver readings over the last few years, it seems the average has gone up.  Highs in the 300s are now highs in the 600s.

The two high set points DO hide data, either by eliminating that hour's data point OR by giving a blank value. They use both techniques. Remember...blank values are different from zero values...but you very, very rarely ever see zero values in Gamma...only in Beta. You can figure out what these set points are if you download data on a regular basis and pay attention to the missing hours. It takes a bit of work to do, and the set points can and do change from year to year and from site to site. I believe they are trying to slowly condition us to the higher readings, as that will become the norm of the future, after a few more big accidents.
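Spotting those missing hours can be automated once you have a download in hand. A minimal sketch; the timestamp format is an assumption about the export, not a confirmed RadNet detail:

```python
from datetime import datetime, timedelta

def missing_hours(timestamps, fmt="%m/%d/%Y %H:%M"):
    """List the hours absent from an (assumed hourly) series of
    timestamp strings - the blackout gaps to pay attention to."""
    ts = sorted(datetime.strptime(t, fmt) for t in timestamps)
    gaps = []
    for prev, nxt in zip(ts, ts[1:]):
        step = prev + timedelta(hours=1)
        while step < nxt:        # every skipped hour is a gap
            gaps.append(step)
            step += timedelta(hours=1)
    return gaps
```

Cross-referencing the gaps against the highest values that do appear just before and after them is one way to estimate where a cutoff sits.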

(12-11-2016, 12:28 AM)matt Wrote: A word on Beta.
It is pretty much discontinued on the west coast now.
They claim cell tower interference. I think they have their scales in an uncalibrated state, judging by the zero readings that show up.
Beta is tough to control as you have that bottom line that doesnt move. Gamma floats and is easy to manipulate.
Big jumps up and down are super common on most sites. Makes you wonder...

Basically what I have seen is a month long hit in the summer of 2011, a pause, another bigger hit in early 2012.  With so many blackouts, it really makes me wonder if we haven't had a bunch of melts since 311.

With all the beta testing being discontinued, as it was in Denver early on, I wonder what particles are floating around.  Storms in Seattle usually take two - three days to reach Denver so we get the Fuku stuff with a dash of Hanford vapor.  With the EPA radnet in such sorry state we won't get early warning of any major radiation releases that might occur.  Russia can't afford another meltdown to become public and China wouldn't say a word if something happened.  Radnet is just political propaganda to reassure the public; not a useful scientific tool to measure radiation contamination.

Denver most likely still gets a lot of crap blowing in from the old Rocky Flats plutonium complex that has now been "decommissioned".
It is also higher in elevation, and that in itself raises the values, from what I have seen.

(12-17-2016, 05:11 AM)piajensen Wrote: Ya know... I'm wondering if, since the rad stations are manned by "volunteers," some entrepreneurial, well heeled sort couldn't take the job of maintaining the system. 

It'd be really cool if, say, someone or a consortium out of Silicon Valley pooled their talent and resources to keep US residents informed.

I think the maintenance excuse was used to cover their not wanting to show us the data. Typically, they change out the monitor's filter twice a week. This shows up on the beta graph really nicely, as a very sudden drop, on a three-day/four-day schedule - typically on a Monday and Thursday in the more recent years, and on a two-day/five-day schedule before that.
I saw NO evidence that this maintenance was NOT done.
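The twice-weekly filter-change drops described here could be tallied by weekday to check the Monday/Thursday pattern. A sketch under an assumed data layout (a plain hourly list of CPM values); the 50% drop threshold is invented for illustration:

```python
from collections import Counter
from datetime import datetime, timedelta

def filter_change_days(start, values, drop_fraction=0.5):
    """Count which weekdays the sharp drops land on.

    start:  datetime of the first hourly reading
    values: hourly CPM readings; a filter change shows as a sudden
            fall of at least drop_fraction from the previous hour
    """
    days = Counter()
    for i in range(1, len(values)):
        prev, cur = values[i - 1], values[i]
        if prev > 0 and (prev - cur) / prev >= drop_fraction:
            days[(start + timedelta(hours=i)).strftime("%A")] += 1
    return days
```

If the maintenance really happened on schedule, the counter should pile up on two weekdays and stay near zero everywhere else.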
 
Reply
#35
(02-05-2017, 05:38 PM)matt Wrote:
(12-17-2016, 05:11 AM)piajensen Wrote: Ya know... I'm wondering if, since the rad stations are manned by "volunteers," some entrepreneurial, well heeled sort couldn't take the job of maintaining the system. 

It'd be really cool if, say, someone or a consortium out of Silicon Valley pooled their talent and resources to keep US residents informed.

I think the maintenance excuse was done to cover their not wanting to show us the data. Typically, they change out the monitors filter twice a week. This shows up on the beta graph really nicely, as a very sudden drop, and is typically on, or was on, a three day four day schedule, twice a week. It typically happened on a Monday and Thursday in the more recent years, and a two day five day schedule before that.
I saw NO evidence that this maintenance was NOT done.
I'm curious about the occasional directives I come across from the higher-ups, not so much the actual rad station operators (since we don't actually hear directly from them)... I should have been clearer about that, but I have typing issues. 

See: 2 Jan 2014 Follow-Up on OIG Report 12-P-0417, Weaknesses in EPA’s Management of the Radiation Network System Demand Attention Project No.OPE-FY14-0010 https://www.epa.gov/sites/production/fil...system.pdf 

Excerpt from page 3:

Recommendations in Report No. 12-P-0417
We recommend that the Assistant Administrator for Air and Radiation:
1. Establish and enforce written expectations for RadNet (Radiation Network) operational readiness
commensurate with its role in and importance to EPA’s mission. Include, at a minimum:
a. Percentage of stationary monitors expected to be operational.
b. Maximum length of time stationary monitors are permitted to be nonoperational.
c. Plan for temporarily backing up broken stationary monitors when operational status is lower than
required.
d. Availability of monitor operators.
2. Implement metrics for RadNet operational readiness to be reviewed daily by NAREL
(National Air and Radiation Environmental Laboratory), and periodically by OAR (Office of Air and Radiation) (at least monthly) and by the Deputy Administrator (as needed), to include, at a minimum:
a. Percentage of monitors operational.
b. Length of time in nonoperational status.
c. Need for backup monitors when operational status is too low.
d. Operator availability.
3. Direct that NAREL improve planning and management for RadNet to include, at a minimum:
a. Provide for in-stock spare parts to assure operational status established under recommendation 1.
b. Implement measures to assure that operators are available.
c. How often filter changes are needed to provide consistency in throughput at NAREL’s analytical
laboratory and implement a metric for these filter changes.
We recommend that the Assistant Administrator for Air and Radiation, in conjunction with the Assistant Administrator for Administration and Resources Management:
4. Require follow-on RadNet contracts to include incentives/disincentives and a requirement for MPRs (monthly progress reports).
5. Require the CO (contracting officer) and COR (contracting officer representative) to formally
evaluate RadNet contractors’ performance on an annual basis and enter information into PPIRS
(Past Performance Information Retrieval System) through CPARS
(Contractor Performance Assessment Reporting System).
6. Determine whether domestic contract options are available for crucial repair parts that are identified
as only being available from a foreign subcontractor.
7. Review the information in MATS (Management Audit Tracking System)
for the prior audit and ensure it is accurate and current. 
 
Reply
#36
Here's the Oct 2016 EPA Vendors Data in xls format for download - RadNet is included - the EPA total obligation for all contracts is more than $4 Billion.
Active Contracts by Contract Number as of 10/6/2015 ... 10, Contract No, Vendor, AAship (Program Office), NAICS - Title, Award Type, Project Officer, SPF Site? Vendor ... 3/31/2004, 6/10/2016, 6/10/2016, $121,891,533, $135,796,338 ...... OAR, 811219 - MAINTENANCE, REPAIR AND CALIBRATION OF RADNET AIR.

And just a few screen shots of the first batch of contracts. My immediate impression - there's a whole lotta money being spent on a program that provides so little return on investment. 
   
   
   
 
Reply
#37
I searched that EPA spreadsheet for 'radnet' and only found 2 entries.

   
   
"The map is not the territory that it is a map of ... the word is not the thing being referred to."
 
Reply
#38
(02-06-2017, 08:40 AM)Horse Wrote: I searched that epa spreadsheet for 'radnet' and only found 2 entries.

That is correct - note I said 
Quote:RadNet is included - the EPA total obligation for all contracts is more than $4 Billion

RadNet's total is just over $10 million. I find the $4 billion contract obligations to be wholly inappropriate.
 
Reply
#39
(02-06-2017, 11:02 AM)piajensen Wrote: RadNet's total is just over $10 million. I find the $4 billion contract obligations to be wholly inappropriate.

Obviously radiation detection is not a major concern to the powers that be, or they're wasting money everywhere else.  Military detection systems have been in place since atomic bomb testing.  Detection equipment at nuclear power plants gave us first notice of the Chernobyl and Fukushima radioactive clouds, not the civilian RadNet.  MIC data can be adjusted for public consumption.  From what I've seen, RadNet is not meant to inform the public, and the media-fed public won't worry about the MIC's radiation contamination.  Don't worry, it's all safe as an x-ray; don't waste money on waste; make a profit.

The hazards of man-made nuclear products are hidden behind the delay in noticeable effects and behind background radiation.  It always seemed to me that background radiation only meant yesterday's old radiation settled into the tested environment.  That background noise prevents us from hearing any new radiation unless it climbs above the noise.  I don't like the term natural radiation.  Radioactivity is a natural process, but we live at some distance, and behind some shielding, from underground and cosmic sources.  Neutron radiation is natural and it kills.  If an atom is radioactive, it's giving off energy whether it's in the ground, on the surface, or above in space.  Shouldn't we consider the size of that atom, as well as the speed and frequency of the energy, when looking for cellular effects?  A BB doesn't hurt like a .38 shell.  Uranium in the ground gets mined and concentrated; that, to me, makes it man-made.  Radiation at the surface of our living planet is just as dangerous to us as the cosmic radiation out in space is to the astronauts.

For our safety, we know nuclear forces have to be contained; we struggle to manage them when they escape.  We should have readily available, sensitive measuring equipment covering the full spectra of alpha, beta, and gamma, to better identify the radioactive elements, with the data for everyone to see, if we're to try to live in an increasingly radiologic environment.
EPA's RadNet is not doing the job.  Perhaps it's not meant to.
"The map is not the territory that it is a map of ... the word is not the thing being referred to."
 
Reply
#40
I think you are correct - regardless of how much is spent, from MIC to civilian radiation monitoring, there is no intention of informing the public about the scientific truth. If the public were well informed, the feds would not be able to manage the fallout, so to speak. Migration away from contaminated sites would occur, construction companies building homes near contaminated sites would go belly up, more people would make greater demands for accountability at Superfund and brownfield sites, more sites would have to be re-categorized, and swift results would be demanded. The more than 600 contractors would come under fire for poor performance and delayed outcomes. Legislators would be called to task about why their state under-performs in protecting their constituents... the feds could never keep up with the regulatory chaos that would ensue.
 
Reply
  

