Concorde - a timely challenge!

Jim Audiomisc

pfm Member
After a gap of almost 50 years I've finally got around to 'decoding' the data recording from the Concorde 001 Solar Eclipse flight in 1973. I can now process the actual measured data, but have a challenge which someone else might be able to clarify for me. So your starter for Mach 2...

The eclipse recording was made onto a Revox A77 via some voltage -> frequency convertors. All the hardware is long gone. But I've now converted the frequency-encoded patterns back into signal voltages. The problem is that one channel provides a 'clock' from the general airplane system and I can't work out how to interpret the patterns!

A section is shown here

http://jcgl.orpheusweb.co.uk/temp/ConcordeTime.png

Note the vertical scale is in terms of the frequency produced by the V->F, but essentially means voltage because I've decoded that bit. The question is: how to read the time from the patterns? The regular time-tick with a 2 sec interval is clear. But not the use of the two other levels, patterned to - presumably - give the actual elapsed time (French standard?). I thought it might be a binary count, but maybe not.

Anyone spot the code?
 
Do you have a longer sequence? There are patterns there, but there's not enough to say what they mean.
 
As things stand I snipped the original into a series of 10-min sections for processing. The digital capture of the tape is at 192k/24 and it's well over an hour long, so a lot of data. Hence at present the data is in 10 min chunks. I then did conversions of 500 sec per chunk to examine. So this is an example

http://jcgl.orpheusweb.co.uk/temp/left-sum-16

The 'left' channel holds the timing, 'right' is the interferograms. The '16' is because this data averages over 16 cycles of the signal frequency to get the frequency produced over that short period by the V -> F, i.e. the signal level input for recording. If it would help I can do a zip of the series of chunks, but note there will be regular gaps, as I took the central 500 sec from each 600 sec chunk. The main interest was the interferograms, which I hope to process.

Each line of data gives the time (sec) from the start of the processed chunk, comma, then the average frequency (signal level) during that period.
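
If anyone wants to poke at the chunks programmatically, here's a minimal reader sketch in C. The filename is just illustrative, and I'm assuming the files are plain ASCII, one comma-separated pair per line.

Code:
#include <stdio.h>

/* Read "time_sec,avg_freq" pairs from one chunk file. */
int main(void)
{
    FILE *fp = fopen("left-sum-16", "r");   /* illustrative filename */
    double t, f;

    if (fp == NULL)
        return 1;
    while (fscanf(fp, "%lf,%lf", &t, &f) == 2) {
        /* t = seconds from start of chunk, f = average V->F frequency */
        printf("%.3f s  %.1f Hz\n", t, f);
    }
    fclose(fp);
    return 0;
}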

Because of the 16-averaging the raw data files were rather bigger! But I can redo some of this later if that helps, or make other info available.

What I can do relatively easily is generate a graph of each 500 sec chunk, showing the patterns, then specify the time offsets for the start of each graph. (The times shown are relative to zero being the start of the processed sections.) But this will take a while, so probably tomorrow now as I've got other things I need to do today.

FWIW I'd be happy at some point to make a copy of the data recording available as 192k/24 stereo. But this is too big for me to send via the net, so it would need to be something like a DVD-ROM or memory stick/card sent by post. The wave version essentially fills a DVD, but flac is smaller. It would be nice for anyone interested to be able to have a copy.
 
  • Words transmitted LSB-first
  • Negative voltage is mark (1)
  • Positive voltage is space (0)
The highest 7 bits are constant as 1.11.11 (1 3 3 octal)

Code:
bits    octal
.1111.  3 6
.11111  3 7
1..11.  4 6
1..111  4 7
1.1...  5 0
1.1..1  5 1
1.1.1.  5 2
1.1.11  5 3
If there were six more values between line 2 and 3, it would be a simple binary count.

I give the values as octal, as this was made in 1973.

It isn’t binary-coded decimal, as it contains 1111 in the second line, which is invalid for BCD.
 
There's a 13-bit data burst every minute. With positive and negative pulses interpreted as zero and one, it seems to encode an increasing value, LSB first. All the values in the sequences posted by Jim are consistent with the low 6 bits being a minute number, the last one in the csv file being 59. In both sets, the remaining 7 bits are constant.
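
To make that concrete, here's a sketch of the decode: assemble one burst LSB first (negative pulse = mark = 1, positive = space = 0) and mask off the low 6 bits as the minute count. The bits[] values are illustrative, not real data.

Code:
#include <stdio.h>

int main(void)
{
    /* bit 0 first; low 6 bits here encode minute 46 octal,
       high 7 bits the constant 1011011 (1 3 3 octal) */
    int bits[13] = { 0, 1, 1, 0, 0, 1,   1, 1, 0, 1, 1, 0, 1 };
    unsigned word = 0;
    int i;

    for (i = 0; i < 13; i++)
        if (bits[i])
            word |= 1u << i;            /* LSB-first assembly */

    printf("word = %05o octal, minute = %u (%02o octal)\n",
           word, word & 0x3f, word & 0x3f);
    return 0;
}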

Is the raw recording publicly available somewhere?
 
I'll have a look at generating something like a 48k/16 flac version of the raw data. That may be small enough for me to put up somewhere for a while. The recording was made on an old Revox A77 and consists of two circa 1kHz squarish waves whose periods (frequencies) were modulated to indicate the input voltages, which have a low bandwidth down to near-dc. So I suspect 48k/16 will contain sufficient detail for a good examination.

I've wanted to make a version public anyway. But initially wanted to check that the recording *could* be decoded and give me the signal patterns. It does, so that's fine. :)
 
Just did a 48k/16 version and it is still 811MB. So far too big for any webspace I have.

I could downconvert further, but the question is: what would still preserve the required details? The key point is that the signal is a pair of quasi-squarewaves centered on about 1kHz, but modulated over a range. FWIW I did the F -> V decoding by determining the time between zero-crossings (interpolating between the pairs of samples that straddle each sign change).
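
In case anyone wants to check the method, the crossing-timing step looks roughly like this. This is a sketch of the idea rather than my actual program; the buffer and parameters are illustrative.

Code:
/* Estimate the average frequency over 'cycles' full cycles by timing
   linearly interpolated zero crossings of the quasi-squarewave.
   x[] = samples, n = sample count, rate = sample rate in Hz. */
double avg_freq(const float *x, long n, double rate, int cycles)
{
    long i;
    int count = 0;
    double first = 0.0, last = 0.0;

    for (i = 1; i < n && count < 2 * cycles + 1; i++) {
        if ((x[i-1] < 0.0f) != (x[i] < 0.0f)) {
            /* interpolate within the pair straddling the sign change */
            double t = (i - 1) + x[i-1] / (double)(x[i-1] - x[i]);
            if (count == 0)
                first = t;
            last = t;
            count++;
        }
    }
    if (count < 2 || last <= first)
        return 0.0;
    /* (count - 1) half-cycles elapsed between first and last crossing */
    return rate * (count - 1) / (2.0 * (last - first));
}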

I'll have a think about generating a better format, but ideally, people could have the 'raw' data so they could check anything I did in case I messed up!
 
Think how hard sharing or even copying that data was in 1973.
48/16 is way overkill too

Agree with both comments. I did take the data tape to the old Slough RSRS/RSRE to their 'ADC' rack system, but that didn't give useful results. The data in the original paper was therefore generated by quite crude methods, as mentioned in the letter to Nature at the time. TBH back then I was a brand new research student and knew zip about digital processing. And had only been asked to ensure we could record the data OK. Someone else was meant to analyse it.
 
If you can get the files to me somehow, I'll be happy to host them on Google Drive.
 
If you have Win 10 or install it in a VM you can get a free 5GB of OneDrive cloud storage. You can then replicate your data file(s) to the M$ cloud and share them with whoever.

http://download.microsoft.com/download/7/8/7/78724952-66BE-4645-A086-7CFF1F2D6E2E/GETTING STARTED DOC V8.pdf

https://support.microsoft.com/en-us...-folders-9fcc2f7d-de0c-4cec-93b0-a82024800c07

Cheers,

DV
 
You are carefully encoding tape noise, which FLAC does not compress well. Low-pass filter it a bit.
8 bits of useful dynamic range is more than adequate for this sort of data.
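
Even a one-pole low-pass would do it. A sketch, with alpha as an assumption you'd tune: roughly alpha = 1 - exp(-2*pi*fc/fs) for a corner at fc, which you'd want comfortably above the ~1kHz carriers.

Code:
/* One-pole IIR low-pass, in place, to tame tape hiss before
   re-encoding. alpha sets the corner frequency (see above). */
void lowpass(float *x, long n, float alpha)
{
    long i;
    float y = x[0];

    for (i = 0; i < n; i++) {
        y += alpha * (x[i] - y);   /* y[i] = y[i-1] + alpha*(x[i] - y[i-1]) */
        x[i] = y;
    }
}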
 
It's probably correct that 8 bits would be OK. However I'm wary of that, and it still leads to fairly large files. I'll do some more checks and report back.

BTW, afraid I tend to throw away or lose any M$ discs, etc. For well over a decade now I've just wiped and discarded them.
 
OK, some info on the effect of downsampling the data. To compare I used a 10 min section. At 'full fat' 192k/24 it gives a 202MB flac file; 48k/16 gives 80MB; 48k/8 gives 28MB; 24k/8 gives 11MB.

What I don't know is how much that may affect the accuracy of the transitions of the 'square waves' which encode the actual signal vs time. I think 48k/16 is pretty much certain to be OK, but don't know about the others.
 
All being well, a copy of the recording should become available in a while. For anyone interested in the timestamping channel I can add the following info:

The "2nd contact" time as viewed from Concorde 001 was 10h53m14s UT and "third contact" was 12h07m13s UT. That repesentes the main period of the eclipse. Day was 30 June 1973.

If anyone is really interested in all the details I can recommend Pierre Lena's book "Racing the Moon's Shadow with Concorde 001".
 
This may help a bit

http://jcgl.orpheusweb.co.uk/temp/Clock.zip

I've reduced the time-resolution of the processed result so I can fit the clock pattern data over a longer time into a manageable file size. The zip contains the results for a succession of 10min chunks, so it shows the patterns over a longer time. One flaw with this analysis is that the simplistic filtering I used to reduce the noise level causes a transient at the start of each 10min section, so anyone looking should disregard that.

Alas, this level of resolution reduction is far too much for decent interferograms to survive, but serves to show the clock patterns.
 
For those interested: the 'audio' signals look like a pair of (quasi) square waves. This is because the actual input voltages for the time and interferometer outputs were passed through a pair of voltage-to-frequency convertors. These nominally give about 1 kHz for an input of 0V. So the info is in the *frequency variations* with time of the pair of square waves, i.e. you need to employ a program to convert the frequency variations back into voltage-level variation patterns.
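
The final step is then just a linear mapping. A sketch, with the calibration constants assumed rather than taken from the real hardware:

Code:
/* Map a measured frequency back to the input voltage, assuming the
   V->F was linear: f0 Hz out for 0V in, slope k Hz per volt.
   f0 and k are assumptions here, not the real calibration. */
double freq_to_volts(double f, double f0, double k)
{
    return (f - f0) / k;    /* f = f0 + k*V  =>  V = (f - f0) / k */
}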

I wrote some crude programs to do this just to ensure the patterns *were* containing the input data. They are, but better conversion and analysis is needed. If anyone wants a copy of my 'quick and dirty' programs, just say. They are in 'C', but are pretty clumsy.
 

