[NCLUG] Software Defined Radio Application

Joseph DiVerdi diverdi at xtrsystems.com
Mon Jan 20 11:09:19 MST 2003


I've been working on a VLF (Very Low Frequency) Software Receiver for use in investigating certain ionospheric phenomena. These investigations involve measuring the *complex* amplitude of signals from several terrestrial VLF transmitters which have been forward scattered off the ionosphere. The complex amplitude is sensitive to the elevation of the reflecting/refracting layer, which in turn is sensitive to the solar wind, CMEs (Coronal Mass Ejections), and supposedly to GRBs (Gamma Ray Bursts).

The primary goal of this work is to set up a multi-channel VLF observatory operating in the 10-100 kHz range using software radio and digital signal processing techniques, one which can monitor several frequencies simultaneously, measure the complex amplitude of each received frequency, and report the results on-line in real time using popular Web-based technologies.

To date, a prototype antenna and mast-mounted preamplifier have been built and erected. Cabling has been installed to carry DC power out to the preamplifier and RF signals back into the workshop/laboratory. A few distant transmitters have also been detected with some hardware that was cobbled together for this purpose.

Current work has moved indoors to the workshop/laboratory. A Pentium I-class computer has been built running Red Hat Linux 7.2, and the GNU Radio software radio suite has been installed.

To make this all work it will be necessary to *continuously* slurp data into the computer at a rate of 200-250 ksamples/second at 16 bits/sample. This single data stream will be split into multiple channels, each processed by complex multiplication, decimation, demodulation, and various forms of filtering. The result of each channel will be one complex value every 0.1-1.0 s.
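For a single channel, the processing I have in mind looks roughly like the following. This is only a sketch: the 250 kS/s rate, the 24 kHz carrier, and the plain boxcar average (standing in for proper decimation and filtering) are placeholders, not final choices.

    /* Rough single-channel sketch: mix one carrier to baseband, then
     * boxcar-average down to one complex value per 0.1 s.
     * Compile: gcc -O2 -o chan chan.c -lm
     */
    #include <stdio.h>
    #include <math.h>

    #define FS      250000.0   /* assumed ADC rate, samples/s           */
    #define FC       24000.0   /* placeholder VLF carrier frequency, Hz */
    #define DECIM    25000     /* 0.1 s of samples per output value     */

    int main(void)
    {
        short  sample;                  /* 16-bit samples arrive on stdin */
        double re = 0.0, im = 0.0;
        long   n = 0;

        while (fread(&sample, sizeof sample, 1, stdin) == 1) {
            double phase = 2.0 * M_PI * FC * (double)n / FS;
            re += sample * cos(phase);  /* complex multiply by local osc. */
            im -= sample * sin(phase);
            if (++n % DECIM == 0) {     /* crude decimation/averaging     */
                printf("%ld %g %g\n", n, re / DECIM, im / DECIM);
                re = im = 0.0;
            }
        }
        return 0;
    }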

I haven't settled upon the exact means for slurping this data into the PC. Several schemes are being considered and evaluated. Today, I'm thinking about the parallel port.

I'm planning to perform the experiments myself very shortly, but I expect the results are already known somewhere, so I'll ask the question:

What are the limits on sampling rates when data is pumped into a PC (running Linux) through its parallel port?

Assuming no subsequent processing, can the port be read (8-bit data only) 400-500k times per second? Continuously? What is the load on the various subsystems (CPU, etc)?
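For the purely polled case, the experiment I picture is timing a tight read loop, something along these lines. Again only a sketch; it assumes a standard first parallel port at 0x378 and root privileges for ioperm().

    /* Rough polling benchmark: how fast can inb() read the data register?
     * Compile: gcc -O2 -o ppbench ppbench.c   (run as root for ioperm())
     */
    #include <stdio.h>
    #include <sys/io.h>
    #include <sys/time.h>

    #define LPT_BASE 0x378             /* assumed base of first parallel port */
    #define N_READS  1000000

    static unsigned char buf[N_READS];

    int main(void)
    {
        struct timeval t0, t1;
        double secs;
        long i;

        if (ioperm(LPT_BASE, 3, 1) < 0) {    /* access 0x378-0x37A */
            perror("ioperm");
            return 1;
        }
        gettimeofday(&t0, NULL);
        for (i = 0; i < N_READS; i++)
            buf[i] = inb(LPT_BASE);          /* one 8-bit read per pass */
        gettimeofday(&t1, NULL);

        secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_usec - t0.tv_usec) / 1e6;
        printf("%d reads in %.3f s = %.0f reads/s\n",
               N_READS, secs, N_READS / secs);
        return 0;
    }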

As the sampling times are generated externally (to the PC) via hardware, is interrupt-driven data sampling appropriate? Advisable? Effect on maximum sampling rate? Effect on subsystem loads?
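By interrupt-driven I mean something like the following 2.4-style kernel module skeleton, where the external sampling clock strobes the port's nACK line. This is only a sketch of the idea: the base address, IRQ 7, and the hand-off of bytes to a user-space reader are all assumptions to be verified.

    /* Rough 2.4-era module skeleton: read the data register on each nACK
     * interrupt from the external sampling clock.
     * Compile against the 2.4 kernel headers with -D__KERNEL__ -DMODULE.
     */
    #include <linux/module.h>
    #include <linux/kernel.h>
    #include <linux/sched.h>
    #include <linux/interrupt.h>
    #include <asm/io.h>

    #define LPT_BASE 0x378
    #define LPT_IRQ  7

    static volatile unsigned long nsamples;

    static void lpt_handler(int irq, void *dev_id, struct pt_regs *regs)
    {
        unsigned char b = inb(LPT_BASE);   /* grab the byte on the data lines */
        nsamples++;
        /* ...push b into a ring buffer for a user-space reader... */
        (void)b;
    }

    int init_module(void)
    {
        if (request_irq(LPT_IRQ, lpt_handler, SA_INTERRUPT, "vlf-sampler", NULL))
            return -EBUSY;
        outb(0x10, LPT_BASE + 2);          /* control reg bit 4: enable nACK IRQ */
        return 0;
    }

    void cleanup_module(void)
    {
        outb(0x00, LPT_BASE + 2);
        free_irq(LPT_IRQ, NULL);
    }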

Any data, experiences, comments would be greatly appreciated.

Best regards,
Joseph
-- 
Joseph A. DiVerdi, Ph.D., M.B.A.          
http://diverditech.com/           970.980.5868 (voice) 
http://xtrsystems.com/            970.224.3723 (fax)
PGP Key ID: 0xD50A9E33


