Doom Journalism has become Fake News

(GLOBALINTELHUB.COM) — 7/22/2017 Dover, DE — The internet is exploding with ‘reports’ of ‘potential cataclysms’ including but not limited to erupting super volcanoes, an EMP or nuclear attack from North Korea, or just plain good old-fashioned American insurrection (which, all should note, will void most insurance policies).  The research team at Global Intel Hub has been investigating global intelligence since 2001 (it was 9/11 that started the chain of events that would lead to the formation of Global Intel Hub).  We research any topic of interest that can have implications for the financial markets, global business, and, possibly most importantly, our health and security (that means low-probability, high-impact events that could gravely harm our safety or health, such as an EMP or an avian flu pandemic).

The US Government has become possibly the largest government organization in the history of humans on planet Earth.  As a percentage of global GDP, in the number of people it employs, the money it spends, its spheres of influence – however you look at it, it’s an octopus.  It’s their job to worry about such things and prepare for them, on behalf of their customers, the US taxpayers (ahem).  So we need to look at a few angles here, as this is a complex issue.

Angles (not related to James Angleton)

One(Strategy/Tactics) “Use it or lose it” is a modus operandi for the US Government.  Many budgets have rules such that, for example, $2 million per year is allowed for computers, but if it’s not needed, it’s not given.  So an agency has an incentive to replace computers each year even if it’s not necessary.  Thousands of such illogical rules are baked into the ‘broken system’ for various reasons, and they apply to programs as well, not only budgets.  So, for example, DHS doesn’t have any problem justifying its R&D projects to stop terrorism, while NASA, Food Stamps, and thousands of other good programs constantly struggle to justify their existence.  During the Cold War, billions were spent on doomsday-scenario infrastructure, including underground bunkers, tunnel networks, and alternative communications systems (which became the internet).  They have all this, and FEMA is the stepchild of that big-brother thinking – if you want to call FEMA the Doomsday agency, it would be appropriate (although practically, FEMA is really a welfare agency).  So yes, it’s true that the US Government spends billions planning for the end of days – but don’t read between the lines: it’s not as if they ‘know’ something that the general population doesn’t.  And they aren’t going to create an apocalypse just to live in an underground city with Dick Cheney’s relatives (who would want that, really?).

Two(Money) The ‘doom’ budget is a fraction of a fraction of the overall budget, so while it is big in dollar terms, it’s certainly not a priority.  And don’t forget, the ‘doom’ budget includes a host of keep-the-US-running provisions on a number of levels, such as the Strategic Petroleum Reserve.

Three(Super Intelligence) The US Government is possibly the first global group Artificial Intelligence, as it appears to have an intelligence of its own.  As such, it has created a number of mechanisms to ensure, at the least, its own survival.  Insurrection, nuclear war, super volcanoes, meteor strikes, and hundreds of other end-game scenarios have all been gamed out in New Mexico, and when the estimated risk rises above a meaningful threshold, a plan (preparation) is made for it.  There are thousands of threats that haven’t been prepared for, because they are so remote.  But let’s say there’s a one-in-a-billion chance our Sun could explode, sucking the planet into a mini black hole created in the aftermath.  There’s not a lot we could do there, so what’s the point of preparing for it?  This is the thinking.
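The screening logic described above – game out a scenario, and prepare only when probability times impact clears some threshold – can be sketched as a simple expected-loss calculation.  All scenario names, probabilities, and dollar figures below are hypothetical illustrations, not actual government estimates:

```python
# Illustrative risk screening: expected annual loss = probability x impact.
# Every number here is hypothetical, chosen only to show the mechanism.

THRESHOLD = 1e6  # only scenarios above this expected annual loss get a plan

scenarios = {
    # name: (annual probability, impact in dollars)
    "nuclear_war":   (1e-4, 1e14),
    "super_volcano": (1e-5, 1e13),
    "meteor_strike": (1e-7, 1e14),
    "sun_explodes":  (1e-9, 1e14),  # too remote: screened out, no plan
}

def worth_preparing(prob, impact, threshold=THRESHOLD):
    """A scenario justifies a preparation plan only if its expected
    annual loss (probability times impact) exceeds the threshold."""
    return prob * impact > threshold

for name, (prob, impact) in scenarios.items():
    print(name, worth_preparing(prob, impact))
```

On these made-up numbers, the first three scenarios clear the bar while the exploding-Sun case does not – exactly the “what’s the point of preparing for it?” logic.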

Four(Non-Human Technology) There is one huge, glaring, super-significant issue that most of the ‘prepping’ community has missed, and either is afraid to address or lacks the intellectual rigor to weigh against the significance of their movement.  What does that mean?  It means there can be other explanations for USG activities – other realities that explain the whys and hows of how the ‘broken system’ in fact works.  That issue is Aliens.  Let’s first talk about logic and rhetoric.  All the supposed ‘evidence’ the conspiracy crowd has about an impending apocalypse is circumstantial compared to the evidence about Alien involvement on planet Earth.  The UFO/Alien phenomenon has facts and evidence several sigmas more significant than any ‘report’ issued about a super volcano that’s about to blow.  If even a fraction of the information pertaining to UFOs/Aliens is accurate, or even in the same ballpark, it MUST BE considered in relation to the ‘doom’ scenarios painted by Doom Journalists – both the chance of them happening and how such scenarios would play out.  Aliens-related or not, the USG has a tunnel network built deep in the ground under the USA that you could call a parallel universe – a network of bunkers, bases, and even what you might call cities, ready to be populated in the event the surface becomes unlivable.  While the obvious military reason for this network is continuity of government and command of the military during a nuclear war, it would serve the same purpose in many of the purported end-game scenarios suggested by these Doom Journalists, such as the super volcano, an EMP, “Planet X,” and a number of other events.

Let’s let the facts lie where they may.  There’s no question that Earth has gone through many cataclysms, many of which could have destroyed all life on the planet at once (or in a matter of days).  Will this happen in our lifetime?  Who knows – there’s not really enough data to calculate, because the statistics used are based on ‘known’ events; there can be many ‘unknown’ events and unknown factors that make statistical analysis irrelevant.
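The estimation problem above can be made concrete.  Even for ‘known’ event classes, the standard trick for estimating the probability of a rare event from a finite record is Laplace’s rule of succession, which avoids assigning exactly zero probability to things never yet observed – but it says nothing at all about event classes missing from the record.  A minimal sketch, with a hypothetical observation count:

```python
def rule_of_succession(events_observed, trials):
    """Laplace's rule of succession: a smoothed frequency estimate,
    (k + 1) / (n + 2), that never assigns exactly zero probability
    to an event merely because it hasn't been observed yet."""
    return (events_observed + 1) / (trials + 2)

# Hypothetical record: zero cataclysms observed across 100 intervals.
p = rule_of_succession(0, 100)
print(round(p, 4))  # small but nonzero, despite zero observations
```

Note the limitation the article points at: this only smooths estimates for events we know to count.  ‘Unknown’ event classes never enter the calculation at all.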

There is ONE BOOK that ALL PREPPERS and wanna-be Doom Journalists need to read, in case they haven’t already – the book that laid the foundation for alt-news people like Alex Jones and an entire generation of conspiracy theorists: Report from Iron Mountain.

This book is itself shrouded in mystery.  Which leads to point five.

Five(Cultural Cultivation by Design) In case you are not aware, the USG is actively involved in cultivating our culture via a number of agencies, but most notably the CIA.  With few exceptions, when there’s a popular uprising, or any form of ‘pop culture,’ there seem to be deep CIA ties.  Whether these movements are ‘created’ from the beginning, or are contacted and ‘cultivated’ once they reach a certain level, remains to be understood by the public at large.  What is clear, though, is that there is a hidden hand that controls all media, entertainment, music, movies, magazines, books – you name it.  If you think this is far-fetched, check out this recent, well-researched article: Documents Expose How Hollywood Promotes War On Behalf Of The Pentagon, CIA, & NSA:

These documents for the first time demonstrate that the US government has worked behind the scenes on over 800 major movies and more than 1,000 TV titles.

Some topics are obvious, such as war movies, movies that promote the military, or films that show the CIA as good and noble.  But what’s more enlightening, and perhaps more disturbing, are the more subtle, en passant movements, such as In-Q-Tel, the CIA’s venture capital arm that invests in Silicon Valley projects, most notably Google and Facebook.

So, this article is not an exposé on how the CIA is involved in pop culture.  However, based on its active involvement, on a number of levels, in virtually any popular movement (even the ‘counterculture,’ such as was pioneered in the 60s by CIA agent Tim Leary and others), it is reasonable to assume that the USG is involved, on a number of levels, in the prepper movement.  In fact, it’s more rational than any other domestic meddling, because many of these ‘movements’ involve plans to exist and survive after the USG ‘collapses’ (for any number of reasons).  And as has been intelligently noted in the prepper community: after going through all the hassle of storing gold, guns, toilet paper, and thousands of other supplies to last you and yours until the next millennium, what’s to stop a hungry Army battalion armed with Apache helicopters and high-powered air-to-surface missiles (and a whole fun house of other weapons they’ve been storing for a ‘rainy day’) from literally raining on your parade with bullets and seizing your goods for ‘the good of the government’?  They say the only real currency is ‘accelerated lead’ – meaning that those who hold gold are foolish to think that those who hold weapons will not take it from them by force, if they choose.  So it leads back to the question of who is the world’s biggest prepper, and the answer is simple: the US Military, which is very well funded, present on all continents, and operates a secure internet, thousands of blast-proof sites, nuclear bunkers, and basically every ‘prepping’ mechanism one can think of – because that’s what the Army does: it prepares for war.  And when there is no war, it trains and prepares.  So effectively, if you are a prepper, you are really a revolutionary, preparing to go head to head with the largest military in the world should a super volcano erupt.  Basically, preppers are the modern ‘anarchists’ that the FBI was created to rein in.
If any of these ‘preppers’ feel as if they are ‘hiding’ deep in the wilderness of Montana or Idaho, they need to look into publicly available technology that allows anyone with an internet browser to collect gobs of data on just about any individual or group inside the USA, without even accessing private networks (which the USG certainly has access to).  Their dreams of hiding are foolish.

So now that prepping has gone mainstream, and we’re seeing the trendy billionaire class have at it, you can bet the CIA is there.  Whether they are in the ‘monitoring’ phase or the ‘cultivation’ phase would be difficult to determine in real time (it’s hard to determine, even in hindsight with lots of documents, how ‘involved’ they were in various important cultural events, like the Moon landing and so on).  But it’s safe to say there is an agenda behind promoting the now-popular ‘doom culture.’  Terrorists are on the run, and war with any real country is not a possibility in modern times – so a new enemy has to be ‘invented,’ as in 1984.  That enemy is simply the unknown.  We really don’t know if the volcano under Yellowstone will erupt.  So we had better start preparing for it, and for hundreds of other things, including but not limited to an Alien invasion – which could be a great, dramatic, staged false-flag reason to get the population to rally together once again, as we all did after 9/11, to ‘chase them back to their planet.’  If you think this is far-fetched, just spend the weekend, or an entire week, reading about USG involvement with Aliens, including but not limited to technology transfer to major publicly listed corporations, various reverse-engineered military technology, and other significant events – more significant than ‘Russia,’ the most irrelevant issue the Democrats could possibly have picked in their futile attempt to manipulate public opinion.

It’s important that smart people learn about this issue now, because it most certainly will be used by the deep state to try to manipulate the people again – and it’s a serious issue, unlike this nonsense about Trump and Russia (who cares if Trump had a Russian partner; the Mayor of New York is Russian, but no one bothered to question him about his potential Leninist background?).

Six(Infrastructure) All this doom infrastructure exists and continues to be built, but it is largely used for routine daily business ‘on the surface.’  Let’s use the best example: the internet itself.  The internet was the first major alternative communication network; its conceptual roots lie in Cold War command-and-control research, and its major nodes and cables were laid starting in the late 1960s, at the peak of Cold War ‘fears.’  The internet is the largest part of any post-apocalyptic infrastructure because communication is the foundation of any civilization.  It’s the difference between a post-event civilization and anarchy, such as portrayed in the film “The Postman.”  The internet is, by design, capable of being ‘cut’ at multiple points and still functioning.  In fact, most people reading this are aware only of the world wide web (WWW), on which this article is published, but there are in fact many ‘internets’ (George W. Bush’s famous slip in saying ‘internets’ was, in a way, logically correct) that run in parallel to the main internet (or WWW), such as SIPRNet:

According to the U.S. Department of State Web Development Handbook, domain structure and naming conventions are the same as for the open internet, except for the addition of a second-level domain, like, e.g., “sgov” between state and gov:[3] Files originating from SIPRNet are marked by a header tag “SIPDIS” (SIPrnet DIStribution).[4] A corresponding second-level domain exists for DoD users.[5] According to the Pentagon, SIPRNet has approximately 4.2 million users.[6] Access is also available to a “…small pool of trusted allies, including Australia, Canada, the United Kingdom and New Zealand…”[7] This group (including the US) is known as the Five Eyes.  SIPRNet was one of the networks accessed by Chelsea Manning, convicted of leaking the video used in WikiLeaks’ “Collateral Murder” release[8] as well as the source of the US diplomatic cables published by WikiLeaks in November 2010.[9]
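The cut-it-and-it-still-works property attributed to the internet above is just graph redundancy, and it is easy to demonstrate on a toy network: remove some nodes and check whether the survivors can still all reach one another.  The five-node topology below is entirely hypothetical, for illustration only:

```python
from collections import deque

# Toy mesh topology (hypothetical): each node lists its linked neighbors.
links = {
    "A": {"B", "C", "D"},
    "B": {"A", "C", "E"},
    "C": {"A", "B", "D", "E"},
    "D": {"A", "C", "E"},
    "E": {"B", "C", "D"},
}

def still_connected(links, cut):
    """True if every surviving node can still reach every other
    surviving node after the nodes in `cut` are destroyed."""
    alive = {n: nbrs - cut for n, nbrs in links.items() if n not in cut}
    if not alive:
        return False
    start = next(iter(alive))
    seen, queue = {start}, deque([start])
    while queue:  # breadth-first search over the surviving nodes
        for nbr in alive[queue.popleft()]:
            if nbr not in seen:
                seen.add(nbr)
                queue.append(nbr)
    return seen == set(alive)

print(still_connected(links, {"C"}))            # hub cut: network survives
print(still_connected(links, {"B", "C", "D"}))  # three cuts: A and E isolated
```

Because each node has multiple independent paths to the others, cutting any single node leaves the rest connected; only a coordinated multi-point cut partitions it – the same design argument the article makes for the internet.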

So it is no surprise that the CIA, and the military in general, plays a big role in publicly used networks like Google and Facebook.  The internet was built in the 1960s and made public in the 1990s – maybe in 20 or 30 years we’ll be travelling from Atlanta to Los Angeles via an underground high-speed rail network?

Seven(Synthesis, Integration) Part of the genius of the AI entity we call the US Government (USG) is how it integrates into society and into ‘the system,’ currently Capitalism.  If we observe the military’s design once again, it is compartmentalized across different states, the idea being that no one state becomes too powerful because it controls, say, the building of nuclear missiles, and might use this power against a rival state.  Parts for nuclear submarines are made in many places and assembled in yet another.  The entire USG is designed like this.  How it works socially: the USG functions as a huge customer for corporate America.  So when we speak of the CIA, roughly 60% of what the CIA actually does operationally is outsourced to corporations, both public and private.  Even public companies, such as Raytheon (RTN), have contracts with the CIA to provide various services.  So when we’re talking about the USG, it’s not just a ‘government’ like your local city council.  It’s a super-super state, a real super-entity.  Whether this happened by evolution or by intelligent design is not really a relevant question (because the answers can be one and the same, i.e., God created evolution – that’s the answer Creationists need).  The important aspect to understand is how ingenious this is.  It virtually guarantees its own success, because all parts of the whole have a vested interest in seeing the system continue, albeit in slightly evolved form.  That means the USG encourages protests, special interests, and even allows foreign influence (Israel, for example), making the USA the de facto one-world government.  What the USG will punish mercilessly is insurrection, making any attempt to overthrow the USG and establish a different, alternative form of government and economy virtually impossible.
Any remote group would quickly be observed by the NSA and the rest of the surveillance apparatus (this was the failure of HAL – ‘Just what do you think you’re doing, Dave?’ wasn’t enough to stop Dave from decommissioning HAL).  The apparatus is so many generations ahead of any potential threat to its existence that the NSA would probably know about the first meeting of a planned revolution before it happened, as everything is now electronic, and these ‘ideas’ can only come from certain sources, which are tracked – as portrayed, poorly, in the film “Conspiracy Theory,” with Mel Gibson checking out “The Catcher in the Rye” (the real book would probably be Report from Iron Mountain).  However, one must understand the thinking here.  Cultivating pop culture, and cultivating the ‘revolutionary’ himself, is the real way to stop any existential threat – something the USG learned in the 1960s.  That means foolish youths who may have such tendencies are molded and shaped from the beginning, so that they end up becoming corporate slaves, and not the Unabomber.

Preppers – meaning real preppers – are largely outcasts of the system.  With the exception of billionaire preppers (who aren’t really preppers; for them, buying a bunker is just another luxurious indulgence), preppers are the modern-day hippies, ‘the fringe’ – and since the 1950s especially, the CIA and others (the FBI, too) have been monitoring and actively involving themselves with fringe elements.  So how do we know who is a real prepper and who is the PLANT?  How do we know whether Alex Jones, or any of the hundreds of newer would-be Doom Journalists, is really a CIA asset?  In some way they are all promoting this new semi-mainstream prepper culture, just by talking about it and giving it significance.  And again, of course, the threats of these events are real – there is no question about that.  But if a tsunami wipes out 99% of the population of the East Coast of America because a super volcano erupts in the Canary Islands, such as Cumbre Vieja, what could one really do about it?  As we’ve outlined, if you managed to survive such a widespread, devastating event in your own private bunker in the mountains of Colorado, you’d have to fight the USG and the military in the aftermath – so you should be prepared to fight them or join them.  And most preppers are preppers because they either don’t agree with the USG or believe it is ‘lying’ to us, in which case they would be reluctant to be part of the ‘join them’ crowd.  That means preppers are all potential revolutionaries: even if they don’t want to ‘overthrow’ the government, in a doomsday scenario where resources are scarce, they’d be putting themselves in a conflict situation just by surviving such an event.  This is the key logic that 90% of preppers haven’t thought through, and it’s far more important to think about this ‘plan’ than to buy gas masks and other paraphernalia.
In fact, the mainstream way preppers are ‘preparing’ is mostly all wrong; keeping a five-year food supply, gold bars, and ammunition in your basement is not really a plan.  Of course, that’s the mainstream prepper crowd; there are hardcore guys, again mostly ex-military, who understand the realities of a post-event scenario and are genuinely ready to survive in the wild.  They know the simple truth about war: combat is more about camping and survival in the wild than about firing bullets.

For those who are skeptical of this intelligence analysis: first, read the book below.  It is quite dated, but it is the foundation of the modern conspiracy movement – a movement largely controlled by the same elements.  Call it the shadow government, call them think tanks, call it Adam Smith’s ‘invisible hand’ – these are the forces behind the Doom Culture.

MUST READ: Report from Iron Mountain

Remember one final thing: the bosses and owners of this whole system live in Del Mar, California, and Boca Raton, Florida.  They don’t want to spend the rest of their lives eating MREs deep underground and using self-wiping toilets.  Everyone has a vested interest in keeping things functioning normally on the surface, and that’s what they are paid to do.  Part of the doom government exists entirely to plan for such extreme events and protect the population from them.  If they succeed, there’s no reason to prepare to the extent that many do.  And if they fail, you can bet they are coming after your food supply first!  So keep this in mind when reading the Doom Journalism that has become popular today.

Article published 7/23/2017 @



Polar Vortex to Plunge Toward Central US Next Week

The atmosphere is preparing to send part of the polar vortex southward toward the United States next week with an outbreak of arctic air and lake-effect snow.

The polar vortex is a large pocket of very cold air, typically the coldest air in the Northern Hemisphere, which sits over the polar region. Occasionally, this pocket of very cold air can get dislodged farther south than normal, leading to cold outbreaks in Canada and the U.S.

The main blast of cold air associated with the plunging polar vortex will swing southeastward into the Central and Eastern states spanning Sunday, Nov. 9, to Friday, Nov. 14, 2014.

According to AccuWeather Long Range Expert Paul Pastelok, “Areas from the northern and central Plains to the Great Lakes, the upper Gulf Coast and the Appalachians will feel significant impact from the arctic outbreak.”

The worst of the cold will be felt from Fargo, North Dakota, and Minneapolis to Chicago and St. Louis beginning on Tuesday and continuing through the middle and latter part of the week. Heating demands will jump and people will be reaching for winter coats before venturing outdoors.

Chilly days and cold nights will be felt farther along in the South and along the Atlantic Seaboard late in the week and into next weekend.

The cold air will be accompanied by gusty winds, which will send AccuWeather RealFeel® temperatures plunging into the single digits and teens in the north and into the 20s and 30s in part of the Deep South. Such cold will raise the risk of hypothermia and frostbite in the North and will make it uncomfortable for some outdoor activities in the South.

“It is possible single-digit low temperatures occur in parts of the northern Plains and the Upper Midwest, away from the Great Lakes with temperatures plunging to 20 degrees below average in parts of the South,” Pastelok said.

While the brunt of the cold air is not likely to be directed toward the I-95 corridor of the East, a breeze accompanying the cold air will make if feel more like December for at least a few days.

RELATED: 2014-2015 US Winter Forecast
Monster Storm in Alaska to Signal Changing Pattern for US
Winter 2014-2015 to Yield Lower Heating Oil Costs in Northeast

The action of cold air passing over the relatively warm waters of the Great Lakes will unleash bands of lake-effect flurries, snow and squalls from the Upper Midwest to the interior Northeast.

While snowfall can be heavy in a pattern such as this, it tends to be highly localized.

“A broad area of snow on the ground would result in significantly lower temperatures,” Pastelok said.

The bare ground and warm Great Lakes waters will take away some of the severity of this early season arctic outbreak.

“Despite the moderation, freezing temperatures are possible along the upper Gulf Coast from northeast of Houston to northern Florida late next week,” Pastelok said.

For folks not ready for winter, there is some indication that the cold weather will ease prior to Thanksgiving.

The weather pattern forecast to send the polar vortex on a southward plunge can be traced back to the western Pacific Ocean, where Typhoon Nuri curved east of Asia earlier this week.

According to Senior Meteorologist Brett Anderson, “In brief, when a typhoon curves away from Asia it causes the jet stream [steering winds] farther to the east across the Pacific and into North America to buckle and amplify days later.”

The contorted jet stream can lead to major storms and big surges of warmth and outbreaks of cold air.

Prior to the plunge of the polar vortex, the pattern will give birth to a powerful storm over the northern Pacific Ocean. This storm, partly associated with the former typhoon, is likely to slam the Aleutians and west coast of Alaska with high winds, huge waves and heavy precipitation spanning Friday into this weekend.

CDC Issues Level 3 Travel Alert As ‘Largest Ebola Outbreak In History’ Spreads

Things appear to be going from bad to worse as the deadly Ebola epidemic surges on. The CDC has issued a Level 3 – Avoid All Non-Essential Travel – warning.

CDC urges all US residents to avoid nonessential travel to Liberia, Guinea, and Sierra Leone because of an unprecedented outbreak of Ebola.

An outbreak of Ebola has been ongoing in Liberia since March 2014.



  • This outbreak also affects Sierra Leone and Guinea; to date, more than 1,320 cases have occurred in the three countries and more than 725 people have died, making this the largest outbreak of Ebola in history.
  • At least three Americans have been infected; two are health care workers in an Ebola clinic. Affected districts include Bomi, Bong, Grand Gedeh, Lofa, Montserrado (including the capital city of Monrovia), Margibi, and Nimba.
  • Instances of civil unrest and violence against aid workers have been reported in West Africa as a result of the outbreak.
  • The public health infrastructure is being severely strained as the outbreak grows.
  • The CDC will send an additional 50 experts to the region within the next 30 days.

Dr. Thomas Frieden, director of the Centers for Disease Control and Prevention, said Thursday that even in a best-case scenario, it could easily take three to six months to stem the epidemic in West Africa.

Do not worry, though, as the US government reassures us there is no risk…


NASA-funded study: industrial civilisation headed for ‘irreversible collapse’?

GIH: Can income inequality actually be a major factor, if not the primary cause, of complete civilizational collapse?  It can, according to a NASA-funded study.  While many frame the struggle between haves and have-nots as an ethical or sociological issue, it may actually be a matter of survival for our species.  Much as cancer kills a human being, any entity that over-consumes the resources of its environment can kill the host, whether that host is planet Earth, a civilization, or the markets.

A new study sponsored by Nasa’s Goddard Space Flight Center has highlighted the prospect that global industrial civilisation could collapse in coming decades due to unsustainable resource exploitation and increasingly unequal wealth distribution.

Noting that warnings of ‘collapse’ are often seen to be fringe or controversial, the study attempts to make sense of compelling historical data showing that “the process of rise-and-collapse is actually a recurrent cycle found throughout history.” Cases of severe civilisational disruption due to “precipitous collapse – often lasting centuries – have been quite common.”

The research project is based on a new cross-disciplinary ‘Human And Nature DYnamical’ (HANDY) model, led by applied mathematician Safa Motesharrei of the US National Science Foundation-supported National Socio-Environmental Synthesis Center, in association with a team of natural and social scientists. The study based on the HANDY model has been accepted for publication in the peer-reviewed Elsevier journal, Ecological Economics.

It finds that according to the historical record even advanced, complex civilisations are susceptible to collapse, raising questions about the sustainability of modern civilisation:

“The fall of the Roman Empire, and the equally (if not more) advanced Han, Mauryan, and Gupta Empires, as well as so many advanced Mesopotamian Empires, are all testimony to the fact that advanced, sophisticated, complex, and creative civilizations can be both fragile and impermanent.”

By investigating the human-nature dynamics of these past cases of collapse, the project identifies the most salient interrelated factors which explain civilisational decline, and which may help determine the risk of collapse today: namely, Population, Climate, Water, Agriculture, and Energy.

These factors can lead to collapse when they converge to generate two crucial social features: “the stretching of resources due to the strain placed on the ecological carrying capacity”; and “the economic stratification of society into Elites [rich] and Masses (or “Commoners”) [poor].” These social phenomena have played “a central role in the character or in the process of the collapse,” in all such cases over “the last five thousand years.”

Currently, high levels of economic stratification are linked directly to overconsumption of resources, with “Elites” based largely in industrialised countries responsible for both:

“… accumulated surplus is not evenly distributed throughout society, but rather has been controlled by an elite. The mass of the population, while producing the wealth, is only allocated a small portion of it by elites, usually at or just above subsistence levels.”

The study challenges those who argue that technology will resolve these challenges by increasing efficiency:

“Technological change can raise the efficiency of resource use, but it also tends to raise both per capita resource consumption and the scale of resource extraction, so that, absent policy effects, the increases in consumption often compensate for the increased efficiency of resource use.”

Productivity increases in agriculture and industry over the last two centuries have come from “increased (rather than decreased) resource throughput,” despite dramatic efficiency gains over the same period.

Modelling a range of different scenarios, Motesharrei and his colleagues conclude that under conditions “closely reflecting the reality of the world today… we find that collapse is difficult to avoid.” In the first of these scenarios, civilisation:

“…. appears to be on a sustainable path for quite a long time, but even using an optimal depletion rate and starting with a very small number of Elites, the Elites eventually consume too much, resulting in a famine among Commoners that eventually causes the collapse of society. It is important to note that this Type-L collapse is due to an inequality-induced famine that causes a loss of workers, rather than a collapse of Nature.”

Another scenario focuses on the role of continued resource exploitation, finding that “with a larger depletion rate, the decline of the Commoners occurs faster, while the Elites are still thriving, but eventually the Commoners collapse completely, followed by the Elites.”

In both scenarios, Elite wealth monopolies mean that they are buffered from the most “detrimental effects of the environmental collapse until much later than the Commoners”, allowing them to “continue ‘business as usual’ despite the impending catastrophe.” The same mechanism, they argue, could explain how “historical collapses were allowed to occur by elites who appear to be oblivious to the catastrophic trajectory (most clearly apparent in the Roman and Mayan cases).”
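The dynamics behind these scenarios can be illustrated with a toy simulation. The sketch below loosely follows the HANDY model's four state variables – commoner population, elite population, nature, and accumulated wealth – but the equations are simplified and every parameter value here is invented for demonstration; this is not the paper's actual calibration.

```python
def handy_step(xC, xE, y, w, dt=1.0):
    """One Euler step of a simplified, HANDY-style society model.

    State: xC commoners, xE elites, y nature, w accumulated wealth.
    All parameter values below are illustrative, not the paper's.
    """
    beta = 0.03       # birth rate for both classes
    alpha_m = 0.01    # healthy (minimum) death rate
    alpha_M = 0.07    # famine (maximum) death rate
    s = 5e-4          # subsistence consumption per commoner
    kappa = 10.0      # inequality factor: elites consume kappa times more
    gamma = 0.01      # nature's regeneration rate
    lam = 100.0       # nature's carrying capacity
    delta = 8.3e-5    # production/depletion rate per commoner

    # Wealth needed for everyone to consume at subsistence level
    w_th = s * xC + kappa * s * xE
    avail = min(1.0, w / w_th) if w_th > 0 else 0.0
    cC = avail * s * xC          # commoner consumption
    cE = avail * kappa * s * xE  # elite consumption

    # Death rates rise toward alpha_M as consumption falls below subsistence
    shortC = 1.0 - cC / (s * xC) if xC > 0 else 0.0
    shortE = 1.0 - cE / (kappa * s * xE) if xE > 0 else 0.0
    aC = alpha_m + max(0.0, shortC) * (alpha_M - alpha_m)
    aE = alpha_m + max(0.0, shortE) * (alpha_M - alpha_m)

    xC += dt * (beta - aC) * xC                       # commoner population
    xE += dt * (beta - aE) * xE                       # elite population
    y  += dt * (gamma * y * (lam - y) - delta * xC * y)  # nature
    w  += dt * (delta * xC * y - cC - cE)             # accumulated wealth
    return max(xC, 0.0), max(xE, 0.0), max(y, 0.0), max(w, 0.0)

# Run a short trajectory from a small initial society
state = (100.0, 1.0, 100.0, 0.0)
for _ in range(500):
    state = handy_step(*state)
print(state)
```

Varying `delta` (depletion) and `kappa` (inequality) changes how quickly nature is drawn down and how the two classes fare, qualitatively echoing the overshoot-and-collapse behaviour described above.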

Applying this lesson to our contemporary predicament, the study warns that:

“While some members of society might raise the alarm that the system is moving towards an impending collapse and therefore advocate structural changes to society in order to avoid it, Elites and their supporters, who opposed making these changes, could point to the long sustainable trajectory ‘so far’ in support of doing nothing.”

However, the scientists point out that the worst-case scenarios are by no means inevitable, and suggest that appropriate policy and structural changes could avoid collapse, if not pave the way toward a more stable civilisation.

The two key solutions are to reduce economic inequality so as to ensure fairer distribution of resources, and to dramatically reduce resource consumption by relying on less intensive renewable resources and reducing population growth:

“Collapse can be avoided and population can reach equilibrium if the per capita rate of depletion of nature is reduced to a sustainable level, and if resources are distributed in a reasonably equitable fashion.”

The NASA-funded HANDY model offers a highly credible wake-up call to governments, corporations and business – and consumers – to recognise that ‘business as usual’ cannot be sustained, and that policy and structural changes are required immediately.

Although the study is largely theoretical, a number of other more empirically-focused studies – by KPMG and the UK Government Office of Science for instance – have warned that the convergence of food, water and energy crises could create a ‘perfect storm’ within about fifteen years. But these ‘business as usual’ forecasts could be very conservative.

Is our Sun falling silent? Prepare for an impending Ice Age

GIH: Climate change is now apparent to every person living on planet Earth – it is no longer just a matter of anomalous events, but of a completely different climate.  But we have only been keeping records for a short period, historically speaking.  It is possible that what we are experiencing now is a common cycle, but because of the short record span, there is not enough data to make a valid assessment of a pattern.  Or we could be living in an age of abrupt change that is unique in the history of our planet.  Regardless of the cause, we must be aware of it and adapt accordingly.  Climate change will impact our lives in many ways: the economy, our health, and, perhaps most importantly, our infrastructure.  If we have merely been living through a period of moderate climate, then we have poorly designed and planned infrastructure that may not survive severe change.  Ice storms knock out power grids, solar events can cause a natural EMP that knocks out electronic infrastructure, and rising seas threaten the large share of the world population that lives near the coast (cities such as New York, London, Miami, Tokyo, and Los Angeles all sit in low-lying coastal areas).  Check out the following data on what gives us life and seasons, the Sun:

“I’ve been a solar physicist for 30 years, and I’ve never seen anything quite like this,” says Richard Harrison, head of space physics at the Rutherford Appleton Laboratory in Oxfordshire.

He shows me recent footage captured by spacecraft that have their sights trained on our star. The Sun is revealed in exquisite detail, but its face is strangely featureless.

“If you want to go back to see when the Sun was this inactive… you’ve got to go back about 100 years,” he says.

This solar lull is baffling scientists, because right now the Sun should be awash with activity.

It has reached its solar maximum, the point in its 11-year cycle where activity is at a peak.

This giant ball of plasma should be peppered with sunspots, exploding with flares and spewing out huge clouds of charged particles into space in the form of coronal mass ejections.

But apart from the odd event, like some recent solar flares, it has been very quiet. And this damp squib of a maximum follows a solar minimum – the period when the Sun’s activity troughs – that was longer and lower than scientists expected.

“It’s completely taken me and many other solar scientists by surprise,” says Dr Lucie Green, from University College London’s Mullard Space Science Laboratory.

The drop off in activity is happening surprisingly quickly, and scientists are now watching closely to see if it will continue to plummet.

“It could mean a very, very inactive star, it would feel like the Sun is asleep… a very dormant ball of gas at the centre of our Solar System,” explains Dr Green.

This, though, would certainly not be the first time this has happened.

During the latter half of the 17th Century, the Sun went through an extremely quiet phase – a period called the Maunder Minimum.

Londoners enjoyed frost fairs on the Thames in the 17th Century

Historical records reveal that sunspots virtually disappeared during this time.

Dr Green says: “There is a very strong hint that the Sun is acting in the same way now as it did in the run-up to the Maunder Minimum.”

Mike Lockwood, professor of space environment physics, from the University of Reading, thinks there is a significant chance that the Sun could become increasingly quiet.

An analysis of ice-cores, which hold a long-term record of solar activity, suggests the decline in activity is the fastest that has been seen in 10,000 years.

“It’s an unusually rapid decline,” explains Prof Lockwood.

“We estimate that within about 40 years or so there is a 10% to 20% – nearer 20% – probability that we’ll be back in Maunder Minimum conditions.”

The era of solar inactivity in the 17th Century coincided with a period of bitterly cold winters in Europe.

Londoners enjoyed frost fairs on the Thames after it froze over, snow cover across the continent increased, the Baltic Sea iced over – the conditions were so harsh, some describe it as a mini-Ice Age.

Cold, snowy winters could become the norm for Europe

And Prof Lockwood believes that this regional effect could have been in part driven by the dearth of activity on the Sun, and may happen again if our star continues to wane.

“It’s a very active research topic at the present time, but we do think there is a mechanism in Europe where we should expect more cold winters when solar activity is low,” he says.

He believes this local effect happens because the amount of ultraviolet light radiating from the Sun dips when solar activity is low.

This means that less UV radiation hits the stratosphere – the layer of air that sits high above the Earth. And this in turn feeds into the jet stream – the fast-flowing air current in the upper atmosphere that can drive the weather.

The results of this are dominantly felt above Europe, says Prof Lockwood.

“These are large meanders in the jet stream, and they’re called blocking events because they block off the normal moist, mild winds we get from the Atlantic, and instead we get cold air being dragged down from the Arctic and from Russia,” he says.

“These are what we call a cold snap… a series of three or four cold snaps in a row adds up to a cold winter. And that’s quite likely what we’ll see as solar activity declines.”

So could this regional change in Europe have a knock-on effect on the rest of the world’s climate? And what are the implications for global warming?

Comment: The implications for global warming are: THAT IT’S OVER! Solar activity is so low that we may indeed be facing an ice age in the not-too-distant future:

Sun’s bizarre activity may trigger another ice age

New paper predicts a sharp decline in solar activity until 2100

Falling temperatures are giving climate alarmists chills

In a recent report by the UN’s climate panel, scientists concluded that they were 95% certain that humans were the “dominant cause” of global warming since the 1950s, and if greenhouse gases continue to rise at their current rate, then the global mean temperature could rise by as much as 4.8C.

And while some have argued that ebbs and flows in the Sun’s activity are driving the climate – overriding the effect of greenhouse gas emissions – the Intergovernmental Panel on Climate Change concludes that solar variation only makes a small contribution to the Earth’s climate.

Prof Lockwood says that while UV light varies with solar activity, other forms of radiation from the Sun that penetrate the troposphere (the lower layer of air that sits above the Earth) do not change that much.

He explains: “If we take all the science that we know relating to how the Sun emits heat and light and how that heat and light powers our climate system, and we look at the climate system globally, the difference that it makes even going back into Maunder Minimum conditions is very small.

“I’ve done a number of studies that show at the very most it might buy you about five years before you reach a certain global average temperature level. But that’s not to say, on a more regional basis there aren’t changes to the patterns of our weather that we’ll have to get used to.”

But this weather would not be the only consequence of a drawn out period of inactivity, says Dr Green.

Polar lights – one manifestation of solar activity in the Earth’s magnetosphere – may dim

“If the Sun were to get very quiet, one of the few things that would happen is that we’d have very few displays of the northern lights. They are driven by solar activity, and we’d miss out on this beautiful natural phenomenon,” she explains.

However, there could be positive effects too.

“Solar activity drives a whole range of space weather, and these are ultimately effects on the electricity networks, on satellites, on radio communications and GPS on your sat-nav,” she explains.

And while scientists cannot discount that the random bursts of activity may still occur, calmer periods of space weather would help to maintain the technological infrastructure that we rely so heavily on.

While the full consequences of a quietening Sun are not fully understood, one thing scientists are certain about is that our star is unpredictable, and anything could happen next.

“This feels like a period where it’s very strange… but also it stresses that we don’t really understand the star that we live with,” says Prof Harrison.

“Because it’s complicated – it’s a complex beast.”

Comment: Not so complicated, if one spends the time to educate oneself on this topic. Here’s what is known to us so far: the surface of the planet was, for a period of time (as reported in numerous stories some five or six years ago), actually warming. There were reports of many hot spots in various places, in some cases hot enough to ignite. This is now being countered by surface cooling due to other factors, like global dimming, the induction of colder air, etc. This heating of the lithosphere is probably due to the slowing of rotation, which generates internal heat between the lithosphere and the mantle and leads to increased volcanic eruptions, earthquakes, sinkholes, etc., and is probably also responsible for many of the strange sounds.

It is the upper atmosphere – the stratosphere – that is cooling and that is the reason for sun pillars, rings around the sun, double/triple/quadruple images of the sun, contrails, etc.

Volcanic dust does not heat the upper atmosphere by “trapping heat” in the stratosphere. That the stratosphere is colder and has dropped lower has been reported by scientists, though that information gets sidelined. The AGW folks would LOVE people to believe that nonsense.

Then, there is the comet dust/smoke in the upper atmosphere that further contributes to the cooling as you can see from the dramatic increase in noctilucent clouds.

At the same time that is going on, the quiescent Sun and the Earth’s weakened magnetic field allow more cosmic radiation to reach the troposphere, where it forms cloud nuclei and increases precipitation from the greater ocean evaporation driven by the increasing heat within the Earth from the slowing of rotation.

These are all the conditions for the initiation of an ICE AGE: heat at the lower levels (the troposphere, where “weather” takes place) and extreme cold at the upper levels of the atmosphere, which can then create interesting effects, including polar vortices.

Has Fukushima’s Radioactive Wave Already Hit California?

California is one of the largest economies in the world.  Home to major corporations, major shipping ports, the film industry, and a buzzing real estate market, California has a lot to lose.  Many are still in denial about the radioactive plume that reached the shores of California some time ago.  Now, seafood is testing positive for radioactivity.  As the following report shows, amateurs with Geiger counters are now recording readings 10–13 times the safe level, setting off their alarms.

From Info Wars:

Health officials confirm spike in radiation on San Francisco beach but have no answers

Predictions that Fukushima’s radioactive ocean plume would hit the west coast of the U.S. sometime in 2014 may have already come to pass, with a new video showing Geiger counter readings of background radiation at a beach in San Francisco over five times the safe level.

Days after a YouTube video emerged showing background radiation at a Coastside beach reaching over 150 micro-REM per hour, health officials in San Mateo County confirmed the spike but said they were “befuddled” as to its cause.

However, officials dismissed the possibility that the readings could be linked to Fukushima radiation reaching the west coast despite forecasts by experts last summer that radioactive particles from Fukushima would reach U.S. coastal waters in 2014.

The video shows a man measuring radiation readings at different spots on a beach south of Pillar Point Harbor. Background radiation in the areas immediately surrounding the beach is normal, but once the man approaches the water itself, the radiation spikes to at least 500 per cent of safe levels and the Geiger counter’s alarm goes off.

The man behind the video claims that on his previous visit to the same beach, radiation readings were 13 times the safe level.
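For context, the quoted multiples follow from simple arithmetic. The baseline below is an assumption for illustration (typical background radiation is roughly 10–30 micro-REM per hour, depending on location); only the 150 micro-REM reading comes from the report.

```python
# Back-of-the-envelope check of the video's figures.
baseline_urem_hr = 30.0    # assumed 'normal' background near the beach (illustrative)
measured_urem_hr = 150.0   # peak reading reported in the video

ratio = measured_urem_hr / baseline_urem_hr
print(f"reading is {ratio:.0f}x background")  # prints 'reading is 5x background'
```

The 5x ratio matches the article's "at least 500 per cent" figure; the claimed 13x reading from the earlier visit would correspond to roughly 390 micro-REM per hour against the same assumed baseline.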

“In the following days, other amateurs with Geiger counters began posting similar videos online,” reports the Half Moon Bay Review. “The videos follow other alarming news last month that starfish were mysteriously disintegrating along the West Coast, a trend that has not been linked yet to any cause.”

The U.S. Environmental Protection Agency and state Department of Public Health are now investigating the cause of the radiation and more information is expected to be released this week.

Image: Extent of Fukushima radiation by 2014. (Rossi et al./Deep-Sea Research).

While officials will almost certainly downplay the situation in order to prevent panic, it’s important to remember that genuine public health crises are virtually always preceded by government duplicity.

TEPCO and the Japanese government have repeatedly been caught lying in their efforts to downplay the scale of the Fukushima disaster. In September it was confirmed that radiation readings around the power plant were 18 times higher than previously reported by TEPCO. After a tank leaked 300 tonnes of toxic water in August, groundwater radiation readings at the plant soared to 400,000 becquerels per litre, the highest reading since the nuclear accident occurred in March 2011.

EPA officials in America also lied in the weeks after 9/11 when they told rescue workers and the general public that the air at ground zero was safe to breathe. According to insiders, EPA officials knew that the dust in the air was laden with asbestos but chose to cover up the truth, leading to at least 20,000 ground zero workers suffering debilitating illnesses and numerous deaths.

Mainstream media outlets have also largely toed the line on Fukushima despite overwhelming evidence of a cover-up of the true scale of the crisis by Japanese authorities. Former MSNBC host Cenk Uygur was told not to warn the public about the danger posed by the meltdown at the Fukushima nuclear plant during his time as a host on the cable network.

Concerns that the federal government is preparing for some form of nuclear emergency have heightened after it was revealed that the Department of Health and Human Services has ordered 14 million doses of potassium iodide, the compound that protects the body from radioactive poisoning in the aftermath of severe nuclear accidents, to be delivered before the beginning of February.

Further Reading

Obama issues executive order to prepare for global warming

President Obama issued an executive order Friday directing a government-wide effort to boost preparation in states and local communities for the impact of global warming.

The action orders federal agencies to work with states to build “resilience” against major storms and other weather extremes. For example, the president’s order directs that infrastructure projects like bridges and flood control take into consideration climate conditions of the future, which might require building structures larger or stronger — and likely at a higher price tag.

In the past 7 days there have been 28 solar flares, and more may be on the way.  Storms such as Katrina, Sandy, and others are becoming more and more common.  The ‘500-year’ storm that supposedly occurs once every 500 years seems to be happening every 5 years.  Global Warming is usually discussed in the context of a longer-term situation that may be a problem for future generations or for animal species threatened by melting ice caps.  But a storm such as Sandy had a huge economic impact on the Atlantic seaboard, and an EMP event could have a devastating impact on the economy and IT systems.  Preparing for “Global Warming” may be parlance for beefing up infrastructure against “impacts” of many kinds.  It serves no purpose for the government to be in a constant state of emergency due to natural disasters caused by Global Warming.  This order may signal the beginning of a larger campaign – and that the US government has joined the ‘prepper’ movement.

The origins of the modern Survivalist movement in the United Kingdom and the United States include government policies, threats of nuclear warfare, religious beliefs, and writers who warned of social or economic collapse in both non-fiction and apocalyptic and post-apocalyptic fiction.

The Cold War era Civil Defense programs promoted public atomic bomb shelters, personal fallout shelters, and training for children, such as the Duck and Cover films. The Church of Jesus Christ of Latter-day Saints (LDS) has long directed its members to store a year’s worth of food for themselves and their families in preparation for such possibilities.[1] The current LDS teaching advises a three-month supply.

America’s history of chemical weapons ‘experiments’ against its own people: Over 4,000 radiation experiments killed or poisoned hundreds of thousands of citizens

In preparing America for nuclear attack during the Cold War years following World War II, thousands of US citizens became the innocent victims of over 4,000 secret and classified radiation experiments conducted by the Atomic Energy Commission (AEC) and other government agencies, such as the Department of Defense, the Department of Health, Education and Welfare, the Public Health Service (now the CDC), the National Institutes of Health, the Veterans Administration (VA), the CIA, and NASA.

Millions of people were exposed to radioactive fallout from the continental testing of more than 200 atmospheric and underground nuclear weapons, and from the hundreds of secret releases of radiation into the environment. Over 200,000 “atomic vets” who worked closely with nuclear detonations at the Nevada test site during the 1950s and 1960s were especially vulnerable to radiation fallout.

Also affected were the thousands of so-called “downwinders”, who lived in nearby small towns in Nevada, Utah, Colorado and New Mexico. These downwinders (along with the animal populations) suffered the worst cumulative radioactive effects of fallout, along with a contaminated environment teeming with radioactive food and farm products. The plight of these poor country people exposed to government-induced radiation sickness has been recorded in Carole Gallagher’s remarkable photo-essay American Ground Zero: The Secret Nuclear War (The Free Press, 1993).

In reviewing declassified records of the AEC (now the Department of Energy) from the 1950s, Gallagher was shocked to discover one document that described the people downwind of the Nevada Test Site as “a low use segment of the population.” Her shock at such callous bigotry eventually caused her to move West to research, investigate and document those who lived closest to the Test Site, as well as workers at the site and soldiers repeatedly exposed to nuclear bombs during the military tests.

Disinformation and Nuclear Fallout

In the nuclear arms race, government doctors and scientists brainwashed the public into believing low dose radiation was not harmful. Some officials even tried to convince people that “a little radiation is good for you.” Totally ignored was the knowledge that the radiation from nuclear fallout could lead to an increased risk of cancer, heart disease, neurological disorders, immune system disease, reproductive abnormalities, sterility, birth defects, and genetic mutations which could be passed on from generation to generation. The full extent of this radiation damage to the American public during the Cold War years will never be known.

A secret AEC document, dated 17 April 1947, reveals that physicians were aware of these radiation hazards but simply ignored them. Under the title “Medical Experiments in Humans,” the memorandum read: “It is desired that no document be released which refers to experiments with humans that might have an adverse effect on public opinion or result in legal suits. Documents covering such field work should be classified ‘Secret’.”

According to Gallagher, many downwinders testified that the Public Health Service officials told them that their ‘neurosis’ about the fallout was the only thing that would give them cancer, particularly if they were female. Women with severe radiation illness, hair loss, and badly burned skin, were clinically diagnosed in hospitals as “neurotic.” Other severely ill women were diagnosed with “housewife syndrome.” When Gallagher’s investigation led her to ask a Department of Energy spokesperson about the AEC/DOE’s practice of waiting until the wind blew towards Utah before testing nuclear bombs or venting radiation in order to avoid contaminating Las Vegas or Los Angeles, the unabashed and unconcerned official actually said on tape, “Those people in Utah don’t give a shit about radiation.”

Secret Radiation Experiments

Only recently, with the forced release of Top Secret documents, have details been revealed about the unethical and inhumane radiation studies conducted during the Cold War years from 1944 to 1974. The initial story broke in November 1993 in a series of articles in the Albuquerque Tribune which identified the names of 18 Americans secretly injected with plutonium, a key ingredient of the atomic bomb and one of the most toxic substances known to man. Some, but not all, of the patients were terminally ill. This horrifying story by journalist Eileen Welsome (who later won a Pulitzer Prize) unleashed a storm of nationwide protest prompting Department of Energy Secretary Hazel O’Leary to order the release of secret files and documents pertaining to these Cold War experiments.

The extremely dangerous plutonium experiment was performed under the auspices of the government’s Manhattan Project, which brought together a revered group of distinguished scientists to develop and test the atom bomb. The purpose of these secret experiments was to establish occupational standards for workers who would be producing plutonium and other radioactive ingredients for the nuclear energy industry.

Some of the classified government experiments included:

  • Exposing more than 100 Alaskan villagers to radioactive iodine during the 1960s.
  • Feeding 49 retarded and institutionalised teenagers radioactive iron and calcium in their cereal during the years 1946-1954.
  • Exposing about 800 pregnant women in the late 1940s to radioactive iron to determine the effect on the fetus.
  • Injecting 7 newborns (six were Black) with radioactive iodine.
  • Exposing the testicles of more than 100 prisoners to cancer-causing doses of radiation. This experimentation continued into the early 1970s.
  • Exposing almost 200 cancer patients to high levels of radiation from cesium and cobalt. The AEC finally stopped this experiment in 1974.
  • Administering radioactive material to psychiatric patients in San Francisco and to prisoners in San Quentin.
  • Administering massive doses of full body radiation to cancer patients hospitalised at the General Hospital in Cincinnati, Baylor College in Houston, Memorial Sloan-Kettering in New York City, and the US Naval Hospital in Bethesda, during the 1950s and 1960s. The experiment provided data to the military concerning how a nuclear attack might affect its troops.
  • Exposing 29 patients, some with rheumatoid arthritis, to total body irradiation (100-300 rad dose) to obtain data for the military. This was conducted at the University of California Hospital in San Francisco.

The Atomic Energy Commission

In 1995 the Energy Department admitted to over 430 radiation experiments conducted by the Atomic Energy Commission between 1944 and 1974. Over 16,000 people were irradiated, some of whom did not know the health risks or did not give consent.

These experiments were designed to help atomic scientists understand the human hazards of nuclear war and radiation fallout. Because the entire nuclear arms buildup was classified secret, these experiments were all stamped secret and allowed to take place under the banner of protecting “national security.”

Amazingly, these clandestine studies were conducted at the most prestigious medical institutions and colleges, including the University of Chicago, the University of Washington, the Massachusetts Institute of Technology, Vanderbilt University in Nashville, and the previously mentioned universities.

Uranium Mine Workers

In addition to these radiation experiments, workers who mined uranium for the AEC in the Four Corners area of Arizona, Utah, Colorado and New Mexico, were exposed to radioactive dust during the 1940s up to the 1960s. Although AEC scientists and epidemiologists knew the dust in these poorly ventilated mines was contaminated with deadly radon gas which could easily cause death from lung cancer, this lifesaving information was never passed on to the miners, many of whom were Native Americans. As a result, many miners died prematurely of cancer of the lung.

Stewart Udall, an Arizona Congressman and lawyer who also served as Secretary of the Interior during the Kennedy and Johnson administrations, represented the miners and their families in a class action lawsuit against the federal government for radiation injuries. In The Myths of August, Udall writes that some physicians who defended the decisions of the atomic establishment sought to justify these experiments by contending that little was known about the health risks associated with the various exposures. Others tried to put a positive face on tests conducted without obtaining informed consent by maintaining that these experiments nevertheless produced advances in medical knowledge. Some physicians argued that the conduct of the AEC doctors should be condoned because they were merely following the ‘prevailing ethics’ of the postwar period. When the miners’ case finally came to trial in 1983, the federal court in Arizona dismissed the case by declaring the US government was immune from lawsuit.

Medical Ethics of the Cold War

How could these physician-experimenters ignore the sworn Hippocratic Oath promising that doctors will not harm their patients? Did they violate the Nuremberg Code of justice developed in response to the Nazi war crimes trials after World War II?

The Nuremberg Code includes 10 principles to guide physicians in human experimentation. In actuality, prior to the Nazi war crime tribunals, there was no written code for doctors; and lawyers defending the Nazi doctors tried to argue that similar wartime experiments were conducted with prisoners at the Illinois State Penitentiary, who were deliberately infected with malaria.

During the Nuremberg trials the AMA came up with its own ethical standards, which included three requirements: 1) voluntary consent of the person on whom the experiment is to be performed must be obtained; 2) the danger of each experiment must be previously investigated by animal experimentation; and 3) the experiment must be performed under proper medical protection and management.

The records now show that many victims of the government’s radiation experiments did not voluntarily consent as required by the Code. As late as 1959, Harvard Medical School researcher Henry Beecher viewed the Code “as too extreme and not squaring with the realities of clinical research.” Another physician said the Code had little effect on mainstream medical morality and “doubted the ability of the sick to understand complex facts of their condition in a way to make consent meaningful.”

Writing in the Journal of the American Medical Association in 1996, Jay Katz recalls an argument at Harvard Medical School in 1961 suggesting that the Code was not necessarily pertinent to or adequate for the conduct of research in the United States. Katz writes: “The medical research community found, and still finds, the stringency of the NC’s first principle all too onerous.” But patients in medical experiments expect the experiment to help them in some way – not to harm them! Patients also are often inclined to totally trust their physicians not to harm them. In The Nazi Doctors and the Nuremberg Code, Katz concludes that many doctors view the Code as “a good code for barbarians but an unnecessary code for ordinary physicians.”

The President’s Advisory Committee

In January 1994 President Clinton convened an Advisory Committee to investigate the accusations surrounding the human radiation experiments. In their final report presented to the president on 3 October, 1995, the Committee found that up to the early 1960s it was common for physicians to conduct research on patients without their consent.

The Committee’s harshest criticism was reserved for those cases in which physicians used patients without their consent in experiments in which the patients could not possibly benefit medically. These cases included the 18 people injected with plutonium at Oak Ridge Hospital in Tennessee, the University of Rochester in New York, the University of Chicago, and the University of California at San Francisco, as well as two experiments in which seriously ill patients were injected with uranium, six at the University of Rochester and eleven at Massachusetts General Hospital in Boston. The plutonium and uranium experiments undoubtedly put the subjects at increased risk for cancer in ten or twenty years’ time.

The Final Report of the President’s Advisory Committee is now available in The Human Radiation Experiments, published in 1996 by Oxford Press. Although the Committee studied the experiments in depth, there was no attempt to assess the damage done to individuals. In many cases, the names and records of the patients were no longer available, nor was there any easy way to identify how many experiments had been conducted, where they took place, and which government agencies sponsored them. The Department of Health and Human Services, the primary government sponsor of research, had long since discarded files on experiments performed decades ago.

The Committee discovered “the records of much of the nation’s recent history had been irretrievably lost or simply could not be located” and “only the barest description remained” for the majority of the experiments.

The Department of Energy also claimed all the pertinent records of its predecessor, the AEC, had been destroyed during the 1970s, but in some cases as late as 1989. All CIA records are classified. When records of the top secret MKULTRA program (in which unwitting subjects were experimented upon with a variety of mind-altering drugs) were requested, the CIA explained that all pertinent records had been destroyed during the 1970s when the program became a national scandal.

Keeping Government Secrets

The Committee made clear that its story could not have been told if the government did not keep some records that were eventually retrieved and made public. However, federal records management law also provides for the routine destruction of older records. Thus, in the great majority of cases the loss or destruction of requested documents was a function of normal record-keeping practices.

The Committee was dismayed to report: “At the same time, however, the records that recorded the destruction of documents, including secret documents, have themselves been lost or destroyed.” Thus, the circumstances of destruction (and indeed, whether documents were destroyed or simply lost) is often hard to ascertain.

In the Committee’s judgment the AEC had repeatedly deceived the public by denying it had engaged in human experimentation, by issuing cover stories to cover up secret investigations, and by deliberately supplying incomplete information to people who participated in government-sponsored biomedical research. It was clear that once government information was “born secret” it often remained that way.

The Committee concludes: “The government has the power to create and keep secrets of immense importance to us all.” Yet, without documents how can historians and other researchers uncover the truth about the government’s clandestine activities? Where is the ‘smoking gun’ when secret records are systematically shredded or reported as ‘lost’? We now know that many people were damaged during the government’s Cold War period of secrets and lies. But how can we uncover the medical and scientific secrets that remain hidden in the still classified documents from 1974 up to the present?

In the absence of medical records and follow-up, the ultimate fate of individuals who willingly or unwillingly “volunteered” for these experiments is not known. The Committee simply did not have the time or the resources to review individual files and histories. In many instances only fragmentary information survives about these experiments; whether people were harmed in these experiments could not be ascertained.

Current Secret Biomedical Experimentation

The US has the world’s largest arsenal of chemical and biological weapons. However, few people are aware of the covert biowarfare experiments conducted by various government agencies, particularly the military and the CIA.

For example, in August 1977 the CIA admitted to no less than 149 subprojects, including experiments to determine the effects of different drugs on human behaviour; work on lie-detectors, hypnosis, and electric shock; and the surreptitious delivery of drug-related materials. Forty-four colleges and universities were involved, along with fifteen research foundations, twelve hospitals or clinics, and three penal institutions. In the infamous MKULTRA mind-altering experiments, the victims were lured to hotel rooms for sexual encounters with prostitutes and were then drugged and monitored by CIA agents.

Military biowarfare attacks against unsuspecting Americans in the 1950s and 60s are a documented reality. The most notorious was a six-day US military bioattack on San Francisco in which clouds of potentially harmful bacteria were sprayed over the city. Twelve people developed pneumonia due to these infectious microbes, and one elderly man died from the bioattack.

In other secret attacks, bacteria were sprayed into New York City subway tunnels; into crowds at a Washington, D.C. airport; and onto highways in Pennsylvania. Biowarfare testing also took place in military bases in Virginia, in Key West, Florida, and off the coasts of California and Hawaii.

For 50 years the shameful details of the government’s radiation experiments were kept secret from the public. In The Plutonium Files, Eileen Welsome notes the ethical horror that resulted from the melding of military and medical agendas during the Cold War. She credits the atomic bomb project’s public relations machine for downplaying the fallout controversy, the illnesses of the atomic veterans, and the diseases of the downwinders. The government propagandists simply placed the blame on sudden wind shifts, misinformed scientists, the overactive imagination of aging soldiers, and even Communist propagandists.

Welsome concludes: “The web of deception and denial looks in retrospect like a vast conspiracy, but in actuality it was simply a reflection of the shared attitudes and beliefs of the scientists and the bureaucrats who were inducted into the weapons program at a time of national urgency and never abandoned their belief that nuclear war was imminent.” She worries if what we have learned from the thousands of radiation experiment documents made public over the last several years will be remembered. Like the Holocaust and the Nazi crimes against humanity, the radiation experiments should never be forgotten.

In reviewing Welsome’s book for the Los Angeles Times (2 January, 2000), Thomas Powers asks: “If the government lied about the danger of nuclear testing, can we trust them to tell us the truth about acid rain, global warming or the safety of deep storage for nuclear waste?”

Does Secret Medical Experimentation Continue?

To this day there are no adequate safeguards to protect people from secret government experimentation. Since the mid-1970s we have witnessed the spectacular rise of genetic engineering and molecular biology, as well as the concomitant outbreak of new and mysterious diseases like AIDS, chronic fatigue syndrome, the peculiar “Four Corners” lung disease discovered on Navajo land, and the appearance of unprecedented “emerging” viruses never before seen on the planet.

Investigators linking the possible origin of these diseases to the dangerous engineering of new microbes are often dismissed as paranoids and crackpots. The mysterious Persian Gulf War syndrome is yet another recent illness clouded in military and biologic secrecy, with the origin and cause still debated and the medical records of sick veterans often “lost” or otherwise unavailable. Not surprisingly, the same government institutions that funded the radiation experiments now largely control the research, the funding, and the cover stories pertaining to all these new diseases and viruses.

What is clear from studying the Committee’s Final Report is that the medical and scientific professions collaborated with the government and the military to abuse and harm US citizens. In the process, the nuclear establishment literally got away with murder. And there is simply no end to the secrets that still emerge from the Cold War years that began 58 years ago with the Manhattan Project.

In January 2000, the government presented the results of a statistical study showing that atomic workers employed in the nuclear weapons industry during the Cold War were more likely to suffer a higher rate of cancer, due to their exposure to cancer-causing radiation and chemicals.

From the 1940s up to the present time, government lawyers and scientists have repeatedly rejected the claims of workers who became sick as a result of nuclear radiation and exposure to deadly uranium, plutonium, and fluorine. As many as 600,000 workers in 14 nuclear weapons plants are now affected by the government’s final admission of wrongdoing in exposing these people to cancer and other chronic illnesses.

According to a Los Angeles Times report, “workers told of spending years trying to get compensation payments from the state, of having to hire attorneys to get disability pay, of going to clinics that forced them to sign away rights to a portion of any future disability payment before they could be treated.”

Kay Sutherland, a worker at the Hanford plutonium plant in central Washington State, told a hearing that “the people in this area have been forced into poverty because they’ve had to retire in their 30s, 40s, and 50s, too young to get a retirement, and too young to get Social Security. They fall through the cracks and they die.” Sutherland has lost four of her five family members to disease, and has an enlarged liver and multiple tumours. She considers herself “a Holocaust survivor for the American Cold War.”

How can we stop these nuclear and biological horrors, which have condemned thousands of innocent people to disease and death? Why must decades of government-sanctioned medical abuse be kept secret and covered-up by scientists and physicians who claim to be concerned about the health of the public?

One way to prevent abuse might be to bring the physician-scientist perpetrators of these experiments to justice in a court of law. However, unless the public is aroused, this is unlikely to happen.

Writing in the Columbia Journalism Review, Geoffrey Sea notes: “A startling fact about the experiments is that, despite the documentation of hundreds of cases of unethical conduct resulting in lasting damage to thousands of people, not a single physician or nurse, scientist or technician, policy maker or administrator has yet come forward to admit wrongdoing.”

For over twenty years the law allowed the US Department of Defense (DoD) to use Americans as “guinea pigs.” This law (the US code annotated Title 50, Chapter 32, Section 1520, dated 30 July, 1977) remained on the books until it was repealed under public pressure in 1998. The new and revised bill prohibits the DoD from conducting tests and experiments on humans, but allows “exceptions.” One of the exceptions is that a test or experiment can be carried out for “any peaceful purpose that is related to a medical, therapeutic, pharmaceutical, agricultural, industrial, or research activity.” Thus, the 1998 law has obvious loopholes which allow secret testing to continue. For details on the restrictions (and exceptions) for human testing for chemical and biological agents, consult the Gulf War Vets website.

Unethical and dangerous experimentation undoubtedly continues in secret up to the present time, ostensibly under the guise of “national security.” Thus, it would seem prudent for patients to think twice before signing up for government-sponsored medical studies, particularly at leading medical institutions. Enlightened patients might also view doctors (and scientists) with a healthy dose of skepticism, and a touch of paranoia.

As weird as all this sounds, it could save your life!


Cantwell AR Jr: Queer Blood: The Secret AIDS Genocide Plot. Aries Rising Press. Los Angeles, 1993.

Declassified: ‘Human Experimentation’ (Video, 1999). A&E Television. Distributed by New Video, 126 Fifth Avenue, New York, NY 10011.

Faden RR, Lederer SE, Moreno JD: “U.S. medical researchers, the Nuremberg Doctors Trial, and the Nuremberg Code: A review of findings of the Advisory Committee on human radiation experiments.” JAMA 276:1667-1671, 1996.

Faden R: “The Advisory Committee on human radiation experiments: Reflections on a presidential committee.” Hastings Center Report 26 (no. 5): 5-10, 1996.

Gallagher C: American Ground Zero: The Secret Nuclear War. The Free Press, New York, 1993.

Harris R and Paxman J: A Higher Form of Killing: The Secret Story of Chemical and Biological Warfare. Hill and Wang, New York, 1982.

Katz J: The Nazi Doctors and the Nuremberg Code. Oxford University Press, New York, 1993.

Katz J: “The Nuremberg Code and the Nuremberg trial.” JAMA 276: 1663-1666, 1996.

Murphy K: “Government finally hears a nuclear town’s horrors.” Los Angeles Times, February 5, 2000.

Sea G: “The radiation story no one would touch.” Columbia Journalism Review, March/April 1994.

The Human Radiation Experiments: Final Report of the Advisory Committee on Human Radiation Experiments. Oxford University Press, New York, 1996.

Udall SL: The Myths of August: A Personal Exploration of Our Tragic Cold War Affair with the Atom. Pantheon Books, New York, 1994.

Watts ML: “U.S. acknowledges radiation caused cancers in workers.” New York Times, January 29, 2000.

Welsome E: The Plutonium Files: America’s Secret Medical Experiments in the Cold War. The Dial Press, New York, 1999.

FUKUSHIMA PHASE II: The Coriums Enter the Hydrosphere


Recent disclosures by TEPCO of severely contaminated groundwater at the Fukushima Daiichi reactor complex have been raising eyebrows in activist communities and beyond. In boreholes drilled just meters from the ocean, TEPCO has recently been finding stratospheric quantities of radioactivity, including cesium-134, cesium-137, strontium-90 and tritium. At first, TEPCO said this radiation – near the shoreline, but below the surface – originated ‘from initial leaks that have remained since earlier in the crisis’ and had stayed near the plant inside the bay. BUT THEY WERE WRONG! This was not old, contained contamination: radiation has been bleeding from the reactors, or from the coriums, into the groundwater, which has been flowing to sea all along. The most recent alarming news is that TEPCO found 2.35 trillion becquerels of radioactive cesium per cubic meter of groundwater in a trench beside reactor 2.

Since early July 2013, measurements peaking in the hundreds of BILLIONS of becquerels per cubic meter (denoted as Bq/m3) of beta radiation have been found in groundwater at the plant! Beta radiation includes such radionuclide groups as cesium, strontium, tritium, and dozens of other isotopes of chemical elements formed by nuclear fission (look at a trilinear chart; the beta emitters are in broken-lined boxes). TEPCO is reluctant to test for strontium-90. STRONTIUM-90 IS WORSE FOR HUMAN HEALTH THAN CESIUM-134, CESIUM-137, OR TRITIUM. That’s why I focus on this radiochemical across many of the hundreds of pages on this site. Read more about why strontium-90 is a worse health threat than cesium.

When The Pacific Was Last Severely Contaminated

In 1954, the U.S. detonated its largest nuclear bomb ever. Code-named ‘Bravo,’ the hydrogen bomb’s explosive yield was roughly 1,000 times that of the A-bomb the U.S. had dropped nine years earlier on Hiroshima.

The target: Bikini Atoll in the central Pacific Ocean. This ultra-radiation release event showered a type of radioactive snow on a populated island that the U.S. refused to evacuate for 50 hours – this was part of the notorious Project 4.1. The event also tainted the ocean so badly that hundreds of fishing boat hulls became contaminated, fishing sites became too ‘hot’ to fish, etc… and the effects lingered on for years. The currents pushed much of this radiation, along with other radiation generated by USSR and U.S. nuclear dumping and bomb testing in the 1950s, to the Eastern Pacific. We’re getting to this point quickly again with Fukushima’s radioactive groundwater ‘flood.’

TEPCO knows what they’re doing. They are trying to keep people apathetic about this disaster and are being advised by the best spin-doctors in the world. For over 2 years now, TEPCO has done its informational ‘drip drop’ routine with all sorts of data, and now they’re doing it with strontium-90 levels measured in groundwater. They’re withholding most of this data and waiting, very carefully, to release it until people no longer care or remember there was even data to be shared. Yet we see from the few disclosures to date that strontium-90 levels keep increasing in groundwater with each sample (see the light-blue chart).

So, is strontium-90 going to be (or already is) a problem in the PACIFIC OCEAN from historic AND future groundwater releases at Fukushima? Strontium, like cesium, is WATER SOLUBLE, UNLIKE PLUTONIUM AND MANY OTHER RADIOCHEMICALS, so we can expect groundwater to be full of the stuff. And…THERE IS A LOT OF STRONTIUM LEFT TO BE LEACHED from the coriums, which are water-drenched, meaning they are slowly leaching radiation into the groundwater. The coriums at Fukushima contain MILLIONS OF TRILLIONS of becquerels of strontium-90 just waiting to be released…and since the corium’s whereabouts are unknown, it’s plausible that groundwater contamination could be getting worse, NOT BETTER!

So, let’s say, soon, TEPCO finds 50 billion becquerels per cubic meter of strontium-90 in groundwater. That’s 50,000,000,000 Bq/m3 of Sr-90. In a leak the volume of an Olympic-sized swimming pool (which holds 2,500 m3 of water), that’s 125 TRILLION becquerels of strontium-90 in one big pool’s worth of water! Scientists are more or less in consensus – though this is a conservative estimate in my estimation – that Fukushima in 2011 released 1,000 TRILLION becquerels of strontium-90 into the Pacific marine environment. So, over time – and we’re talking just a few months from now – ten Olympic pools’ worth of contaminated groundwater at 50 billion Bq/m3 of Sr-90 would exceed the most conservative estimate of the strontium-90 load put into the Pacific from the 2011 meltdowns (volatilized Sr-90), dumping, and leaks combined. On July 12, 2013, a TEPCO worker said 450 tons of groundwater flow into the Pacific every day (1 ton of water is about 1 cubic meter), so it would take just 55 days for those ten Olympic pools’ worth to flow out. Perhaps this has already happened, or has happened a few times, since TEPCO has confirmed underground leaks to the sea have been happening since 2011.
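The arithmetic above can be sanity-checked in a few lines of Python. Every figure is the article's own assumed value, not a measurement: the hypothetical 50 billion Bq/m3 concentration, the 2,500 m3 Olympic pool, the roughly 450 m3/day outflow, and the conservative 1,000 trillion Bq estimate for 2011.

```python
# Sanity check of the strontium-90 scenario in the paragraph above.
# All inputs are the article's assumed values, not real measurements.

CONC_BQ_M3 = 50e9          # hypothetical Sr-90 concentration, Bq/m^3
POOL_M3 = 2500             # volume of an Olympic-sized pool, m^3
FLOW_M3_DAY = 450          # groundwater outflow (~450 tons/day), m^3/day
RELEASE_2011_BQ = 1000e12  # conservative Sr-90 release estimate, Bq

per_pool_bq = CONC_BQ_M3 * POOL_M3         # 1.25e14 Bq = 125 trillion per pool
ten_pools_bq = 10 * per_pool_bq            # 1.25e15 Bq, exceeds the 2011 estimate
days_for_ten = 10 * POOL_M3 / FLOW_M3_DAY  # ~55.6 days at the stated flow rate

print(per_pool_bq)                          # 1.25e14
print(ten_pools_bq > RELEASE_2011_BQ)       # True
print(days_for_ten)
```

Ten pools at that concentration carry 1.25 quadrillion Bq, which is why the text says roughly 55 days of outflow would exceed the most conservative 2011 release estimate.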

We have a real crap-fest on our hands if this keeps going on and no one stops the groundwater flow. On August 5th of 2013, it was learned that tainted groundwater has breached an underground barrier at the Fukushima Daiichi plant’s sea front, an event that Japan’s nuclear regulatory body, the NRA, dubbed an emergency (data now shows that underground water has risen over this underground wall, a sort of chemical liquid glass barrier; the conclusion: tainted water is now leaking uncontrolled into the sea). Critics have long been saying that barriers being built by TEPCO around its reactors will simply back up the groundwater and cause new challenges, from imminent overflow (and faster flows of tainted water) to thwarting or overwhelming pumping AND STORAGE capability to literally crushing the reactor structures from the water pressure! If the entire REACTOR site becomes a type of ‘soup’ – if it is all turned into the saturated zone – the reactors could lose their stability and move, or sink. In an earthquake, the moist ground could experience liquefaction: it could behave as water even though it (the soil) is a solid. This would be very bad.

With this latest breach, TEPCO said that it was going to increase its testing for strontium-90 in seawater/groundwater despite an announcement by the utility in mid July 2013 that it was (a) reducing the frequency of analysis of Sr90 in sea samples and (b) was planning to cease reporting measurements (of Sr90) in becquerels, and instead in (erroneous) units of dose or other parameters meant to confuse and mislead, IMO. We shall see what happens. The fact remains that strontium-90 poses a greater danger, unit for unit, than tritium or cesium to human health and the environment, and strontium-90 will be entering the environment via groundwater flows in amounts many times that released by air during the meltdowns in 2011.

The worrisome part is we could soon be seeing radioactive measurements of strontium, cesium, tritium, and soon perhaps even plutonium, reaching record high levels as TEPCO throws up its hands and says it has no way or plan of dealing with UNCONTROLLED groundwater leaks for years, or decades!



SEAFOOD LOVERS ACROSS THE WORLD: The seafood you are eating now and in the future will be ticking with radiation, but its ‘levels’ will be considered ‘safe’ by government scientists. Why? Because their ‘limits’ are way too high. Yet it is a fact that the only safe level of radiation, if we want to reduce genetic defects and cancer incidence worldwide, is zero becquerels of anything. I COULD ADD TONS MORE…BUT YOU PROBABLY WON’T BELIEVE ME AND THIS WEBSITE IF IT CONTRADICTED THE RHETORIC OF THE AGENCIES YOU HAVE COME TO TRUST – AGENCIES WHOSE ACRONYMS SHOULD BE SHORT FOR DEROGATORY AND INSULTING TERMS INSTEAD OF WORDS LIKE ‘REGULATORY’ OR ‘HEALTH’ OR ‘PROTECTION.’ Protect your own health and regulate your genetic stability for the sake of your children, grandchildren, etc., by NOT EATING SEAFOOD OR CONSUMING ANYTHING MADE IN THE SEA.


TEPCO has seen the light and now has admitted what the experts were suspecting. Leaks are happening. The plant operator has also validated a theory put forward by a well-known, tireless blogger last week that tidal movements were responsible for some of this leakage. Iori of Fukushima-Diary wrote on his blog’s July 19th post ‘Possibility of seawater coming up to the plant side’ the following:

‘It’s the possibility that seawater may come into the plant, and also contamination is flowing to the sea groundwater….Water flows from high to low. It’s natural to think the contaminated groundwater is flowing to the deeper part of the sea, which is underground of the sea. Wrapping it all, it seems rational to think groundwater and seawater are moving back and forth as full tide repeats. By repeating the motion of the tide, contamination is possibly moving back and forth between land and sea too. As long as they keep injecting water to the reactors, Fukushima plant will keep being the worst contamination source of the Pacific.’

In late July 2013, The Asahi Shimbun stated that TEPCO had ‘confirmed water level variations in monitoring wells as early as in January [2013]’; in other words, TEPCO knew since the beginning of 2013 that the ‘wells where the radioactivity was detected were fluctuating in sync with tide levels.’ TEPCO’s delay in relaying this information to the public was blamed on poor ‘in-house communications,’ an explanation that has angered many groups in Japan.

Strontium-90 Findings in Fukushima Daiichi Groundwater

8,300 Bq/m3 | sample date: 1?/?/12 | publish date: 12/12/12 | beside reactor 3

850,000 Bq/m3 | sample date: 3/26/12++ | publish date: 6/7/13

1,000,000 Bq/m3 | sample date: 5/24/13 | publish date: 6/19/13

1,200,000 Bq/m3 | sample date: 6/07/13 | publish date: 7/11/13 | boring no. 1

++During a four-hour leak event (into the ocean) in March 2012, TEPCO had measured – 10 minutes BEFORE finding the leak! – 17 million Bq/m3 of beta radiation in the water, five percent of which was strontium-90.

Thanks Fukushima Diary for their work! It’s helping us.

Maritime Safety Agency, ‘Sr-90 and Cs-137 density in seawater became the highest in past 40 years’

SEAWATER RESULTS: 5,800 Bq/m3 (157 pCi/L) | sample date: 6/26/13 | publish date: 7/31/13 | north side of units 5 & 6 water outlet
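The parenthetical pCi/L figure above follows from the standard unit definition (1 Ci = 3.7e10 Bq, so 1 pCi = 0.037 Bq); a minimal conversion sketch:

```python
# Convert an activity concentration from Bq/m^3 to pCi/L,
# using the standard definition 1 Ci = 3.7e10 Bq (1 pCi = 0.037 Bq).

BQ_PER_PCI = 0.037

def bq_m3_to_pci_l(bq_per_m3: float) -> float:
    """Convert Bq per cubic meter to picocuries per liter."""
    bq_per_l = bq_per_m3 / 1000.0  # 1 m^3 = 1,000 L
    return bq_per_l / BQ_PER_PCI

# The reported 5,800 Bq/m3 works out to the quoted ~157 pCi/L.
print(round(bq_m3_to_pci_l(5800)))  # 157
```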

If Iori is right, then deep contamination may be a huge problem, and TEPCO isn’t looking for it; they’re just looking at seawater in its port. Radiation could be emerging in large quantities daily from a spring or whatchamacallit hundreds of meters out at sea – or even miles or tens of miles away – where TEPCO is barely monitoring; according to Fukushima Diary, TEPCO has ‘only three monitoring points outside of the port’: two are ‘close to the coastal line,’ and in a 1-kilometer-radius area offshore they have just one monitoring point.

What this means is that there’s an entire set of NEW plumes of highly contaminated seawater floating about, and no one has any idea when these plumes left the plant, or of their toxicity level or makeup. Some of this undiagnosed leaked radiation in the ocean is mixing with OLDER plumes, especially near the coast, to create significant spikes in contamination in ocean water and fish (seafood).

FLOOD, NOT LEAK – from ‘The Food Lab’ guy: ‘They use the word leak, to play it down. Their estimate is 450 tons a day of ground water flowing into the Pacific Ocean. This is probably a very conservative estimate! It’s a flood, not a leak!’


Carbon 14 and the 411 on Corium – July 29, 2013 – Bobby1, a blogger at WordPress, explores the hidden story of the latest at Fukushima. Excerpts:

‘The neutrons from the chain reactions, along with neutron-emitting isotopes like plutonium and curium, interact with the nitrogen injections to cause carbon-14 to be released. There have been no measurements made of this isotope. Along with tritium (from neutrons and boric acid injections), this isotope is most deleterious to plant life. Linus Pauling, the two-time Nobel Prize-winning chemist, felt that carbon-14 was the most dangerous isotope of them all.’…

‘Unit 3, which has been giving off a light show on the Daiichi webcams the past few weeks, has been giving off steam. Nitrogen has been injected into this reactor continuously, in order to prevent a hydrogen explosion. There is a shortfall in nitrogen recovered from the gas management system. This shows that the steam is coming from the reactor, or from the hole in the ground underneath the reactor, with MOX corium down there…The underground corium has encountered water. So there is a risk of a hydrovolcanic explosion with this plutonium-rich fuel.’ {More: Cesium density in reactor 3 “steam” higher than the gas inside of PCV}

More articles about Fukushima and health

Skyrocketing Levels of Strontium-90 Being Found in Groundwater at Fukushima. A Look at ‘Why?’ – July 2013

FAQs on radioactive contamination of food in U.S.

‘Japan Finds Radioactivity in More Foods from California: The California Radiation Report’– chart of contamination levels

‘Vital Information that U.S. Scientists and the U.S. Government Isn’t Telling You about Pacific Seafood Tainted by Fukushima’ | update on the tuna situation | update on the bluefin tuna study

The threat posed to health by eating radioactive foods

More analyses on Fukushima

Fukushima’s NOT over. IT’S ACTUALLY GETTING WORSE. Please donate if you like seeing our coverage on this situation. DUST OFF YOUR BLOGS AND GET ACTIVE AGAIN ABOUT FUKUSHIMA! Share, like, tweet, flip out!

IS THE PACIFIC OCEAN FOOD CHAIN DOOMED? – May 2013 – Incredibly worrisome levels of cesium, including short-lived radioactive cesium-134, have been found near Hawaii in the LOWEST part of the marine food chain: plankton. Due to bioaccumulation, levels up the food chain (fish, whales, seals) must be orders of magnitude higher in contamination now or soon. Stop eating Pacific wild seafood now – Researchers find high cesium in some Pacific plankton

New Zealand not so pure

New Zealand’s green claims are pure manure: Country’s food scares and poor environmental record at odds with ‘100% Pure’ slogan

  • Despite marketing claims, New Zealand has a poor environmental record
  • Food scares and poor water quality cast doubt on ‘100 per cent’ pure slogan
  • Revelations may hit nation’s food export industry
Pure? New Zealand’s poor environmental record has cast doubt on its claims to be ‘100% Pure’

For a country that markets itself to the world with the slogan ‘100% Pure’, New Zealand’s environmental credentials are not as impeccable as many would think.

The majority of its rivers are too polluted to swim in. Its record on preservation of natural environments is among the worst in the world on a per capita basis.

And it is the only OECD country that does not produce a regular national report on its environment.

The discovery by dairy giant Fonterra of a bacterium that can cause potentially fatal food poisoning in ingredients sold to eight countries exposes New Zealand’s vulnerability to food safety scares and the fragility of the clean, green image underpinning its farming- and tourism-based economy.

Agricultural exports, including dairy, meat, fruit and wine, command high premiums internationally thanks to New Zealand’s reputation as a producer of safe, natural and high-quality food.

‘It was only a matter of time before our dirty little secret came out,’ said Jill Brinsdon, brand strategist at Radiation, a brand agency in Auckland.

‘Fonterra is our largest exporter and they’re completely intertwined with New Zealand’s image and also they’re the absolute biggest benefactor of the “100% Pure” brand.

‘When you’re coming out with something that presents itself as fact, or 100 per cent pure, then you have to be 100 per cent pure and we’ve proven that we’re not.’

New Zealand’s primary sector, which includes fishing and forestry, accounts for some 60 per cent of exports and 18 per cent of the country’s $160 billion GDP, among the highest proportions in the developed world. Tourism makes up another 10 per cent or so of GDP.

The country has long marketed itself internationally with the ‘100% Pure’ slogan in print and TV ads, drawing millions of visitors each year to experience its national parks, beaches and lakes.

Pollution: Early morning light shines through the smog in Christchurch, New Zealand

Contaminated: A notice warning of industrial liquid pollution in the Waimakariri river, Christchurch in New Zealand

With barely 4.5 million people spread over a mountainous area larger than the United Kingdom or California, and with more than a quarter of that area set aside for reserves and national parks – the backdrop for the popular Lord of the Rings movie trilogy – New Zealand has no shortage of unspoilt natural attractions.

But the marketing overlooks a dark side to the country’s environmental credentials.

More than 60 per cent of New Zealand rivers monitored by the Environment Ministry had ‘poor’ or ‘very poor’ water quality and were rated as unsafe for swimming due to pollution.

Dairy farming, which has a lot riding on New Zealand’s strong environmental reputation, has been a significant cause of poor river quality due to fertiliser and effluent runoff.

Unlike many other countries, New Zealand cows are kept on grassy pastures year-round, a major selling point for its $9 billion annual global dairy trade.


Facade: New Zealand trades off its natural beauty, but many rivers are too polluted to swim in

Fonterra Kauri plant in Whangarei. A botulism scare at Fonterra was the company’s second contamination issue this year, after it earlier found traces of dicyandiamide, a potentially toxic chemical, in some products

‘Because we’ve had a lack of regulation on farm waste for 20 years it’s been a free for all, so farmers have done what they can to produce more milk – which is to put more cows on pastures,’ said Mike Joy, an ecology and environmental sustainability scientist at Massey University.

Prime Minister John Key, who has been previously criticised for saying the 100% Pure marketing should be taken with a pinch of salt, said New Zealand would always be reliant on dairying, with its natural competitive advantage and global demand rising.

‘The right answer is not for New Zealand to sell less dairy. The right answer is for New Zealand to be absolutely sure that the safety standards are met,’ he said on Tuesday.

More than 60 per cent of New Zealand rivers monitored by the Environment Ministry had ‘poor’ or ‘very poor’ water quality

While separate from its environmental credentials, New Zealand’s food safety record is also not without stain.

Until the late 2000s, New Zealand had the highest rate in the developed world of food-borne campylobacteriosis, a serious and sometimes deadly disease caused by a bacterium often found in uncooked chicken.

By 2011, even after a major government initiative to control the epidemic, New Zealand still reported incidents of the disease at more than double the rate of nearby Australia and 12 times the rate of the United States, according to the University of Otago.

The botulism scare at Fonterra was the company’s second contamination issue this year, after it earlier found traces of dicyandiamide, a potentially toxic chemical, in some products.

Even so, New Zealand has one of the most stringent food safety regimes in the world, and the recent dairy product scares only turned up because of the sophisticated and sensitive testing now available.

Fonterra expects the current contamination issue to be resolved within days.

A protracted, major animal health incident, rather than a localised contamination issue, could wreak havoc on the New Zealand economy.

A decade ago, at the height of a foot and mouth epidemic in Europe, the Reserve Bank of New Zealand modelled the impact of a limited outbreak of the livestock disease – estimating an immediate 20 per cent hit to the currency, as well as a 12 per cent fall in exports and an eight per cent hit to GDP in two years.

‘We’ve got to wake up and look more closely at our green credentials, and work harder to create a pristine environment so consumers can get a product which matches the story,’ said a consultant to New Zealand companies operating in Asia.

‘We can’t be complacent.’


Little Ice Age is Coming in 2014

Thursday, March 7, 2013 19:34

The cooling is a significant warning that the globe is headed toward a Little Ice Age (LIA). Climate scientists have predicted that 2014 marks the beginning of a new age in which the Earth will go through a series of unstable variations, with global temperature fluctuating into a dangerously cold climate. However, with the current mass production of carbon dioxide (CO2), it is unlikely that we will see a major ice age like the one experienced 12,000 years ago.

The Little Ice Age should be the story of the century, yet it’s only being announced quietly by climate scientists and solar physicists. Not a word is being mentioned by the mainstream media, who had a hand in selling the Global Warming propaganda that has become irrelevant as we slide into a colder climate. US solar scientists announced years ago that the sun appears to be headed into a lengthy spell of low activity, which means that the Earth is far from facing a Global Warming catastrophe and is actually headed into a Little Ice Age said to last for 60 to 80 years.

Victor Manuel Velasco Herrera, a geophysicist at the University of Mexico, agrees, stating that in about five years the ‘Earth will enter a “Little Ice Age” which will last from 60 to 80 years and may be caused by the decrease in solar activity.’ The geophysicist slammed the UN Intergovernmental Panel on Climate Change (IPCC), saying its stand on global warming ‘is incorrect because [its projections] are based only on mathematical models and presented results at scenarios that do not include, for example, solar activity.’


It’s not 2014 yet, but we’re already seeing and feeling the signs of the Little Ice Age:

Japan broke a winter record this year: northern Japan is currently blanketed by unprecedented volumes of snow, more than five meters deep, with houses covered like igloos and roads turned into snow tunnels.

Another record for Russia: snow piled up to five meters, causing gridlock in Moscow. (Pictured: a house in Russia nearly covered in snow.)

A Texas blizzard broke a 120-year-old record, hammering the state in February 2013 with 19.1 inches of snow. The blizzard, accompanied by fierce winds in excess of 75 mph, surpassed the previous mark set on February 16, 1983.

Roads in India were buried under 100 ft of snow; crews had to cut through the mass to reconnect a Himachal Pradesh hill resort to the landlocked Lahaul Valley on the Himalayan slopes.

Toronto, Canada broke a snowfall record on February 23, 2013, according to Environment Canada: at Pearson International Airport, 12.4 centimeters of snow covered the ground, breaking the record of 7.1 centimeters set in 1967. (See also: Ice Boulders in Canada go Viral!)

A state of emergency was declared (March 6, 2013) in Virginia, where 200,000 were left without power due to heavy, wet snow; some 20 inches had fallen. Winter storm warnings are in effect for much of Virginia and Philadelphia, and much of the Northeast has been bombarded with heavy snow and blizzard activity.


Now you know the reason the Global Warming nuts have gone into hiding: ever since Climategate, the Global Warmists have lost all credibility. Some are still claiming that CO2 is causing Global Warming, and some are now claiming that CO2 is causing Global Cooling and that it has nothing to do with the lack of solar activity!

Climate change has little to do with carbon dioxide emissions and everything to do with solar activity. CO2 doesn’t heat up the planet, but it does help keep the heat in, preventing the planet from dipping into a major planetary ice age like the one experienced 12,000 years ago. Dr. James Lovelock explains in the video (below) how greenhouse gases have helped stop the onset of a major ice age.

Meanwhile the internationalists, politicians and global warmists want to continue the fraud and decrease the amount of CO2 in the atmosphere. I guess they want to go back to the ice ages?

Let’s look at the snowpack progression from 2004 to 2013 with the help of NOAA’s NOHRSC National Snow Analysis page; you can clearly see that it is indeed getting colder.