Radioactive decay is the process by which unstable atoms lose energy through the emission of ionizing radiation, in the form of particles or photons (gamma radiation). When an unstable atom emits a particle its nucleus changes, which means that the atom transmutes into a different isotope or a different element. Ionizing radiation has enough energy to detach electrons from atoms and in that way generate ions. Chemical bonds and reactions depend solely on electrons, and for that reason ionizing radiation can affect chemical bonds, which can degrade materials, but also, in the case of organisms, cause cell or DNA damage (or in the worst case: cancer). Ionizing radiation can be found everywhere, from cosmic rays, to UV light, to radiation generated by naturally occurring radioactive elements. Luckily for us, we have multiple mechanisms that make it relatively hard to develop cancer, even if our DNA gets damaged by ionizing radiation.
There are multiple techniques to measure radiation, capable of measuring different properties of it, such as its direction, its energy and its type. One of these techniques is the Geiger counter, which only "counts" ionizing radiation events. The main component of the Geiger counter is the Geiger tube, a tube filled with a gas mixture to which a high voltage (~400 V) is applied. The tube is normally non-conductive, but once a radioactivity event occurs within the chamber, the gas mixture turns conductive for a very short period (~1 ms). "Classic" Geiger counters emitted a "tick" sound whenever one of these events occurred, which, depending on the rate, indicated the amount of radiation detected by the tube.
In this project I used a Geiger counter to measure background radiation, to perform some mathematical analyses and also see how different conditions (such as the distance from the tube of a radioactive fossil) affect the count rate.
The Geiger counter setup
For the Geiger counter I used an SBM-20 (СБМ-20) tube, which I bought for less than 10 USD on eBay long ago. These tubes were produced in the Soviet Union during the Cold War. They have a length of 108 mm and a diameter of 11 mm, use a gas mixture of neon, bromine and argon, require a voltage between 350 and 475 V, and when they detect an event they turn conductive for ~0.5 ms.
The setup that I built to make the measurements looked like this:
Voltage is boosted from 6 V to 400 V and supplied to the Geiger tube. When an ionizing radiation event occurs, current flows through a megaohm-range voltage divider and is fed to an MCU ADC through a buffer. The MCU samples at ~18 kHz and sends the time when the event occurred through a virtual serial port to the SBC, which stores the data into a file.
To avoid affecting the measurements I taped the tube to the bench:
Radioactivity is everywhere, so the first thing that I did was to measure the radioactivity at my bench. I measured 68,043 counts during a 38.2 h period, which gave me a rate of 29.7 CPM (Counts Per Minute). I also computed the 95% confidence interval of the rate, which was from 29.5 to 29.9 CPM.
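Since the counts are Poisson-distributed, the variance of the total count equals its mean, which makes the confidence interval straightforward. A minimal sketch of the calculation (using the normal approximation, which is excellent at these count levels):

```python
import math

def cpm_with_ci(counts, minutes, z=1.96):
    """Estimate the count rate and a normal-approximation 95% CI.

    For Poisson counts the variance equals the mean, so the
    standard error of the total count is sqrt(counts)."""
    rate = counts / minutes
    half_width = z * math.sqrt(counts) / minutes
    return rate, rate - half_width, rate + half_width

rate, lo, hi = cpm_with_ci(68043, 38.2 * 60)
print(f"{rate:.1f} CPM, CI95: {lo:.1f} - {hi:.1f}")  # -> 29.7 CPM, CI95: 29.5 - 29.9
```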
Let's talk a bit about the statistical aspects of radioactivity events. Radioactive decay events are random and are not affected by external conditions (temperature, pressure, magnetic or electric fields, etc.). Unless the materials that are decaying have a short half-life (a large decay constant), the radiation events should occur at a relatively constant rate (when measured using a large number of events in a long time window). These events can be modeled as a Poisson point process, a stochastic process that can be specified with a single parameter: the rate of the points, which specifies their average density.
The intervals between the points of a Poisson point process follow an exponential distribution, which can be specified with a single rate parameter as well. Since the tube can't detect 2 events with a time separation of < 0.5 ms, the Poisson point process (and the exponential interval distribution) can't perfectly describe the captured data, but at low levels of radiation the chance that 2 events occur with a separation of < 0.5 ms is still quite low.
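To get a feel for how small that effect is, here is a quick simulation sketch: it draws exponentially distributed intervals at the measured background rate and counts how many would fall inside the ~0.5 ms insensitive period:

```python
import random

RATE_CPM = 29.7               # measured background rate
DEAD_TIME_S = 0.5e-3          # SBM-20 insensitive period
rate_per_s = RATE_CPM / 60.0

random.seed(0)
n = 100_000
# Intervals of a Poisson point process are exponentially distributed.
intervals = [random.expovariate(rate_per_s) for _ in range(n)]
# Any event closer than the dead time to the previous one is missed.
missed = sum(1 for dt in intervals if dt < DEAD_TIME_S)
print(f"fraction of events lost to dead time: {missed / n:.5f}")
```

The expected loss is 1 - exp(-rate · dead_time) ≈ 0.025% of events at this rate, so the exponential model is a very good approximation for background radiation.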
At this point it was time to test the theory, so I generated a histogram of the interval times and normalized it to obtain an empirical probability density function (PDF), which I compared to the exponential PDF at 29.7 CPM.
Then I did the same, but with the cumulative distribution function (CDF).
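A sketch of how such a comparison can be done, shown here with simulated intervals standing in for the captured ones (replace `samples` with the real interval data to reproduce the plot):

```python
import math
import random

rate = 29.7 / 60.0            # events per second
random.seed(1)
# Stand-in for the captured interval data.
samples = [random.expovariate(rate) for _ in range(10_000)]

# Empirical CDF (fraction of intervals <= t) vs the theoretical
# exponential CDF, 1 - exp(-rate * t).
for t in (1.0, 2.0, 5.0, 10.0):
    empirical = sum(1 for s in samples if s <= t) / len(samples)
    theory = 1.0 - math.exp(-rate * t)
    print(f"t = {t:4.1f} s  empirical {empirical:.3f}  theory {theory:.3f}")
```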
The empirical data matched the theory really well. To perform a more objective test there are goodness-of-fit methods such as the Kolmogorov-Smirnov test, but I didn't want to make this blog post more boring. One last thing that I wanted to quickly and roughly check was whether the counting rate changed over time (or, technically, whether the process was stationary), so I plotted the CPM rate for every hour:
As can be seen, at least in this time window, the process appears to be stationary.
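The hourly-rate check is easy to reproduce from the raw event times; a sketch, assuming the captured timestamps are in seconds since the start of the measurement (the `hourly_cpm` helper is mine, not from the original setup):

```python
from collections import Counter

def hourly_cpm(timestamps_s):
    """Bucket event timestamps (seconds) into hours and return the
    average CPM for each hour (60 minutes per bucket)."""
    buckets = Counter(int(t // 3600) for t in timestamps_s)
    return [buckets.get(h, 0) / 60.0 for h in range(max(buckets) + 1)]

# Toy example: 2 events in the first hour, 3 in the second.
print(hourly_cpm([12.0, 80.0, 3600.0, 3700.0, 5000.0]))
```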
Of all the common objects that I have in the house, I found just one that produced a very noticeable count rate increase: a fossilized bone. It generated 237.5 CPM, with a 95% confidence interval of 236.5 - 238.7 CPM, when laid over the tube in direct contact with it.
The rest of the objects that I tried couldn't even double the count rate.
Next I wanted to see how distance affected the measured radiation, so I measured the rate with the fossil separated from the tube at distances of 2, 4, 8, 16 and 32 cm:
Background: Mean: 29.7 CPM, CI95: 29.5 - 29.9
2 cm: Mean: 81.0 CPM, CI95: 80.4 - 81.5
4 cm: Mean: 57.2 CPM, CI95: 56.7 - 57.6
8 cm: Mean: 40.8 CPM, CI95: 40.5 - 41.2
16 cm: Mean: 33.3 CPM, CI95: 33.0 - 33.6
32 cm: Mean: 30.6 CPM, CI95: 30.4 - 30.9
To get a rough approximation of the radiation contribution of the fossil I subtracted the background radiation:
2 cm: Mean: 51.3 CPM
4 cm: Mean: 27.5 CPM
8 cm: Mean: 11.1 CPM
16 cm: Mean: 3.6 CPM
32 cm: Mean: 0.9 CPM
The inverse-square law tells us that the radiation sensed by the tube should be proportional to the inverse of the squared distance:
So doubling the distance reduces the radiation flux to 1 / 4. I used the previous background-subtracted data to calculate the radiation reduction:
2 cm -> 4 cm: 1 / 1.9
4 cm -> 8 cm: 1 / 2.5
8 cm -> 16 cm: 1 / 3.1
16 cm -> 32 cm: 1 / 3.9
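These ratios come straight from the background-subtracted rates. Recomputing them from the rounded values listed above gives almost the same numbers (the last ratio comes out as 1 / 4.0 here because the post's figures were derived from the unrounded rates):

```python
# Background-subtracted CPM measured at each distance (cm).
fossil_cpm = {2: 51.3, 4: 27.5, 8: 11.1, 16: 3.6, 32: 0.9}

distances = sorted(fossil_cpm)
for near, far in zip(distances, distances[1:]):
    ratio = fossil_cpm[near] / fossil_cpm[far]
    print(f"{near} cm -> {far} cm: 1 / {ratio:.1f}")
```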
As can be seen, only from 16 to 32 cm was the reduction close to the expected 1 / 4. This is because the law applies only to "point sources", and the farther the distance, the more the source approximates a point. Instead of measuring the radiation for a specific distance, it is also possible to do the opposite exercise: estimate the distance by measuring the radiation flux. This would allow us to use multiple Geiger tubes to compute the position in space of a radioactive object through trilateration.
Just for fun I tested 2 barriers in-between the radiation source and the tube: a 1.85 mm thick plastic ruler and a 0.5 mm thick metallic box:
Background: 29.7 CPM
Unblocked: 57.2 CPM
Ruler: 48.1 CPM
Box: 43.8 CPM
I subtracted the background and also computed how much of the radiation passed through each barrier to the tube:
Unblocked: 27.4 CPM
Ruler: 18.4 CPM (67.1%)
Box: 14.1 CPM (51.4%)
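The percentages follow directly from the rates above; recomputing them from the rounded values gives nearly identical results (the post's 27.4 CPM unblocked figure and the 67.1% / 51.4% fractions come from the unrounded rates, so this sketch prints 66.9% and 51.3% instead):

```python
background = 29.7
unblocked = 57.2
barriers = {"ruler": 48.1, "box": 43.8}

# Fossil contribution with no barrier in place.
source = unblocked - background
for name, cpm in barriers.items():
    transmitted = cpm - background
    print(f"{name}: {transmitted:.1f} CPM ({100 * transmitted / source:.1f}%)")
```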
As can be seen, the metal box blocked more than 3 times as much radiation per unit of thickness as the plastic ruler. These experiments could have been much more interesting; for instance, I would have liked to test different metals, but sadly I didn't have much around to perform more interesting experiments.
I built the Geiger counter long ago but never performed any measurements until now. Even though the results that I got were more or less expected, it's always rewarding to see how the theory works in practice. Most of these experiments were very rough and lots of assumptions had to be made; for instance, radiation emitted by the barriers themselves was assumed to be negligible, as any barrier emission would have contaminated the results.
Besides using radioactivity to compute the position of a radioactive object in space, another interesting, but not very practical, use for a radiation source and a Geiger counter is to generate random numbers. One way to generate them is to measure 2 consecutive intervals: if the first one is larger than the second one, generate a 0 bit, while if the second interval is larger than the first one, generate a 1 bit. In this way 1 bit is generated every 2 counts, so at the maximum rate that I got of 237.5 CPM, I could generate 1.98 random bits per second (i.e.: 16 s would be required to generate a random 32-bit integer).
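A sketch of that scheme, assuming a list of measured inter-event intervals. Equal intervals, which can happen due to the finite ~18 kHz sampling resolution, are discarded to avoid bias; since the intervals are independent and identically distributed, "first larger" and "second larger" are equally likely, which is what makes the bits fair:

```python
def bits_from_intervals(intervals):
    """Turn pairs of consecutive intervals into random bits:
    first > second -> 0, second > first -> 1, ties discarded."""
    bits = []
    for first, second in zip(intervals[::2], intervals[1::2]):
        if first > second:
            bits.append(0)
        elif second > first:
            bits.append(1)
    return bits

# Example: 3.1 > 0.5 -> 0, then 0.2 < 4.0 -> 1.
print(bits_from_intervals([3.1, 0.5, 0.2, 4.0]))  # -> [0, 1]
```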