2009-05-16
As I mentioned before, the Geiger-Müller tube I've been using in my experiments has unknown characteristics. I decided to scan its Geiger Plateau with the current lashed-up power supply system. It was a simple matter to break the voltage regulation feedback loop and control the tube voltage by varying the inverter supply voltage.
Disconnecting the 10 Meg feedback resistor and setting up the Americium source at a reasonable distance to give a suitable count rate took only moments, and soon I was collecting data at a 715 volt supply. Things looked fine, so I scripted up a cron job to periodically graph the accumulating data to a gif file on my workstation's web server and left the room to do other things... Later that evening, when I checked the growing graph, I was absolutely horrified by what I observed:
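(For anyone wanting to do something similar, the graphing side is nothing special. The sketch below is a minimal stand-in rather than the script I actually ran; the log file, its "unix_time count" line format, the paths, and the choice of Python/matplotlib are all assumptions for illustration, and it writes a PNG where I used a gif.)

    #!/usr/bin/env python
    # Minimal sketch of a cron-driven plotting job (illustrative only).
    # Assumes a log of "unix_time count" pairs, one sample per line.
    import matplotlib
    matplotlib.use("Agg")                    # no display available under cron
    import matplotlib.pyplot as plt
    import numpy as np

    LOG = "/var/log/geiger/counts.log"       # hypothetical log path
    OUT = "/var/www/html/geiger.png"         # somewhere the web server can see it

    t, counts = np.loadtxt(LOG, unpack=True)
    t = (t - t[0]) / 60.0                    # minutes since the run started

    plt.figure(figsize=(8, 3))
    plt.plot(t, counts, lw=0.8)
    plt.xlabel("Elapsed time (minutes)")
    plt.ylabel("Counts per sample")
    plt.title("Accumulating GM tube data")
    plt.tight_layout()
    plt.savefig(OUT, dpi=80)

A crontab entry along the lines of */5 * * * * /usr/bin/python /home/me/plot_counts.py (paths again hypothetical) keeps the image fresh every five minutes.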
Periodic modulation of the activity count! That can't be right... All kinds of crazy ideas went through my head: some kind of very slow wander of the tube voltage now that it is unregulated? Charging of the source/detector system by the ion beam? Massive clock drift? Nope, no reasonable explanation could be found. I autocorrelated the data to extract the period:
Approximately 830 seconds; something very familiar about that period... Yep, roughly the same length as my QRSS Beacon cycles. Upon turning off the beacon the effect vanished.
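(The autocorrelation itself is nothing exotic. Here is a minimal numpy sketch of the sort of thing I mean, not the code I actually used; it assumes the same hypothetical counts.log as above, with the samples taken at a fixed, known interval.)

    #!/usr/bin/env python
    # Rough autocorrelation of the count series to pull out any periodicity.
    import numpy as np

    SAMPLE_INTERVAL = 10.0                   # seconds per logged sample (assumed)

    _, counts = np.loadtxt("/var/log/geiger/counts.log", unpack=True)
    x = counts - counts.mean()               # remove the DC level first

    # Full autocorrelation; keep the non-negative lags and normalise to lag 0.
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]
    acf /= acf[0]

    # First local maximum after the autocorrelation has dipped below zero;
    # for a roughly sinusoidal modulation that lag is one full period.
    below = np.where(acf < 0)[0]
    if below.size:
        start = below[0]
        lag = start + np.argmax(acf[start:])
        print("Dominant period: about %.0f seconds" % (lag * SAMPLE_INTERVAL))
    else:
        print("No clear periodicity found")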
After the obligatory muttering of expletives I set about working out how the beacon RF was affecting the detector system (so I could fix it). Listening to the clicks tells you basically nothing; once the rate is more than a few per second you lose the ability to tell anything about fine-scale changes in rate, and I couldn't hear the modulation by ear with any degree of confidence. Unfortunately that left me with more unknowns: was the detector sensitivity being modulated, or was the counting system glitching? Only integration over a measured time interval can extract that kind of information, so I backed the tube voltage down to 620 volts (where the regulation is likely better) and left the data collection to run overnight while I slept on it.
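(In other words it's a Poisson statistics question: integrate the counts over fixed windows and see whether the window-to-window scatter is consistent with plain square-root-of-N counting statistics, or whether there is excess variation that has to be genuine rate modulation. A minimal sketch of that comparison, again against the hypothetical counts.log used above:)

    #!/usr/bin/env python
    # Compare the observed scatter of windowed counts against the Poisson expectation.
    import numpy as np

    WINDOW = 60        # samples per integration window (60 x 10 s samples = 10 min, assumed)

    _, counts = np.loadtxt("/var/log/geiger/counts.log", unpack=True)
    n = (len(counts) // WINDOW) * WINDOW     # drop the ragged tail
    windows = counts[:n].reshape(-1, WINDOW).sum(axis=1)

    mean = windows.mean()
    observed_sd = windows.std(ddof=1)
    poisson_sd = np.sqrt(mean)               # expected scatter for a steady source

    print("Mean counts per window : %.1f" % mean)
    print("Observed std deviation : %.1f" % observed_sd)
    print("Poisson expectation    : %.1f" % poisson_sd)
    # Scatter well above the Poisson value points at real rate modulation (or
    # counting glitches); scatter at about the Poisson level is just statistics.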
Next day when I checked the data I saw more disturbing trends. The data is far more stable, but it still shows long-term variation much too large to be explained by background variations or the half-life of the source (432.2 years). The setup is open to the shack environment, so variations in atmospheric temperature and pressure may modulate the absorption of the alpha particles on their ~20 mm trip to the detector.
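(How big could that effect plausibly be? Treating the air gap as an ideal gas, the amount of air the alphas have to get through scales with density, i.e. with P/T, so ordinary weather swings are worth a few percent. A back-of-envelope sketch, with the pressure and temperature figures below picked purely for illustration:)

    #!/usr/bin/env python
    # Back-of-envelope: fractional change in air density (and hence the mass of air
    # in the alpha path) over a plausible range of shack weather.  Illustrative only.

    def density_ratio(p_hpa, t_celsius, p0_hpa=1013.25, t0_celsius=20.0):
        """Air density relative to the reference state, ideal gas assumption."""
        return (p_hpa / p0_hpa) * ((t0_celsius + 273.15) / (t_celsius + 273.15))

    for p, t in [(990.0, 25.0), (1013.25, 20.0), (1030.0, 15.0)]:
        print("P = %7.2f hPa  T = %4.1f C  relative air density: %.3f"
              % (p, t, density_ratio(p, t)))

If the alphas are arriving with little energy to spare after the air gap and the tube window, a few percent change in the effective air thickness could plausibly shift the count rate by an amount like what I'm seeing.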
To study this further and eliminate any tube voltage variation as a cause I've reinstated the voltage regulation feedback. I'll capture data at this lower but controlled voltage (413 volts) for a few days and see how it behaves. It looks like scanning the tube's Geiger Plateau with the alpha source is a bust for now... I might try it with the source's gamma emissions, but they aren't really strong enough to swamp the background variations I've observed. Time to get a better source?
What I really need to do is build the counting device into a proper enclosure, adding RF screening, etc. Tracking down the perturbing effects is a time-consuming process, especially as the current PSU can't supply enough current to allow very high count rates to accumulate data faster. I'm not sure where the tube saturates, but the supply voltage starts to drop significantly above 15 kCPM - not that my current Americium source can give much more than that anyway. A supply capable of more current and tighter regulation is required to do more general experiments with different tubes (I've since acquired a number of rather insensitive small gamma/beta dosimeter tubes whose background count is only about 5 CPM). It would be interesting to compare several over time.