An old friend of mine is one of the authors on this. Their instruments generate a lot of data. I don't know if this is the case for their array in Australia, but when he was working in South Africa they had to carry hard drives out of the country because there wasn't enough fiber capacity out of the continent.
Not necessarily: the car may arrive much sooner than the last byte sent over the fiber does. Not everything is streaming video; some datasets only make sense as a whole. Backups are an example.
I think the subtext is that it's easily 1000x more dangerous, personally, to drive the data along public roads than to send it over a network. The value of avoiding that danger is perhaps incalculable.
Data can be encrypted to make it safe. Data will be duplicated onto disks before being driven away. It's not like there can only ever be one copy of the data.
I recently found out that that is referred to as the sneakernet, and is actually still a preferred way to transfer a lot of data.
https://en.wikipedia.org/wiki/Sneakernet
If you send someone several TB of random data on a hard disk, you can now very securely send them several TB worth of encrypted data. That's many years of audio at reasonable quality: 1 TB is about 72 days of raw 44.1 kHz, 16-bit stereo audio, and with some speech codecs it could be years of voice data. A one-time pad can be bigger than a notepad.
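The 72-day figure checks out as back-of-the-envelope arithmetic (assuming "1 TB" here means 1 TiB of binary storage):

```python
# Sanity check: how much raw CD-quality audio fits on a 1 TiB disk?
bytes_per_second = 44_100 * 2 * 2   # 44.1 kHz sample rate, 16-bit (2 bytes), stereo
tib = 2**40                          # 1 TiB; a decimal 10**12 TB gives ~65.6 days
days = tib / bytes_per_second / 86_400
print(f"{days:.1f} days of audio")   # ~72.1 days
```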
My lab still does this (or, will do it when we’re finally allowed back in).
We do a little bit of compression right on the instrument’s computer. After that, a briskly-walking postdoc can move a few terabytes a minute. The exact rate depends on whether they stop for coffee, but it always blows our internal 100 Mbps network out of the water.
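The "blows the network out of the water" claim is easy to quantify. A rough sketch, where the payload size and walk time are purely illustrative assumptions:

```python
# Effective bandwidth of a drive-carrying postdoc vs. a 100 Mbps link.
# The sneakernet numbers are made-up but plausible assumptions.
payload_terabytes = 4        # drives in a backpack
walk_seconds = 120           # a brisk two-minute walk, no coffee stop
sneaker_gbps = payload_terabytes * 8_000 / walk_seconds  # gigabits per second
network_gbps = 0.1           # the internal 100 Mbps network
print(f"sneakernet is ~{sneaker_gbps / network_gbps:.0f}x faster")
```

Even with generous rounding, the disk-carrying postdoc wins by three orders of magnitude; latency, of course, is another story.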
I worked for a company CERN had bought equipment from a while back. I was on the technical support team. I had a direct path to engineering when CERN called. The amount of data in an ultra short time was seemingly endless.
So many "need more interfaces, more bandwidth...buffer tuning... buffer tuning".
When a card burnt up mysteriously I made a sci fi joke about black holes and CERN in my ticket to engineering. It was the one and only time I got them to make a joke back ;)
I wish I could remember the joke, but IIRC it was pretty terrible as I was just desperate to get something back from that engineering team that seemed like ... fun.
Fortunately they either didn't care for, missed, or just never acknowledged most of my puns and really vague references.
Outside that one interaction the only thing I knew from them was that despite me being not the best technically compared to my peers, they asked I work specific tickets at times... but I think that was mostly because they knew I didn't make things up and would be honest and say "I don't know". Also I'd give them a heads up when I saw something hitting the fan to give them time to prepare ;)
I was doing my PhD when the LHC was being designed. It was interesting to listen to the discussions about signal and data management for the detectors closest to the vertex.
The rate was so incredible that the triggers had to be heavily optimized to get rid of the noise and keep the first wave of useful data (which was then processed several more times down the road).
The other interesting thing was to see how people were fighting for real estate in the detector, to be able to place their elements somewhere interesting.
I crawled once under one of the detectors (ATLAS) and imagining the several thousand tons above me was frightening.
The MWA has a direct fibre connection from site to a super computing facility in Perth. Although some processing (correlation and averaging) is usually done on site to reduce the data a little, there are some observation modes that send the raw voltages.
This is how Google Street View cars work too: the sheer size of the high-resolution photographic data is so huge that it would make no sense to transmit it over a mobile network versus just downloading it once the car finishes its route.
I was just leaving radio astronomy when the MWA was in the later stages of design/deployment. I remember the discussions around designing its data storage, which centered on the idea of having to fly the data set back and forth (at the time, around 2008-2011?). The data was so large, on the order of many terabytes if I recall correctly, that it wasn't really feasible to use the Internet for each data collection run.
That isn't as uncommon as you may think. Especially when talking about DR (disaster recovery) plans, sneakernet is certainly a valid way to get backup data from off-site because you never know what the network status will be when you need to enact your plan.
A question pops into my head whenever I see news of signal detection from the early age of the universe.
As the universe expands we can see further into the past, since light has been traveling a longer distance. However, since time has to pass before that distance becomes significant, there's a limit to how far we can see, and that limit is continually pushed further as the universe expands (and its expansion accelerates).
Now, my question is: how is it possible that we can see so far into the past? Was the early expansion rate of the universe much faster than I assume, so that it's still possible to capture these signals? Are we capturing signals spread and reflected over time? Or is there some other explanation, or a wrong assumption on my part?
> Now, my question is, how is it possible that we can see so far in the past?
I think you're overthinking it. Forget about expansion for a moment, or other weird effects. We can see 12 billion years into the past because the universe is far larger than 12 billion (or 24 billion) light years across--possibly infinite, possibly not, but nonetheless at least that large, and at least that large at the moment the signal was emitted. What we're seeing isn't a reflection, it's light that has been traveling 12 billion years in a straight line from point of origin.
Because of expansion, the earth[1] and the emission source may have been closer than 12 billion light years, but they were still very far apart. The big bang doesn't mean the universe exploded from a point source, at least not in any intuitive sense of the phrase. The exact topology is still unknown, but assuming a flat, infinite universe, the universe was already flat and infinite when the signal was emitted.
Well, if the furthest stuff we can see all happens to be the same distance away from us, that would mean the universe is larger than the observable part, unless we're at the center, which, as history shows, is a bad assumption to make.
We are very unlikely to be in the center of the universe, and all evidence points to the universe having no center. The observable universe is centered around the observer, by definition. This article explains it better than I can in a HN comment block: https://www.livescience.com/62547-what-is-center-of-universe...
Thanks. Is red-shift average the same in all directions?
If there were a big-bang, and we were at the edge, different parts would accelerate away from us at different speeds
That would give a measurable center...??
Otherwise, I suppose universe is a sheet of elastic stretched everywhere equally
--
I learnt stuff from the link. It suggests the universe wraps around the way a balloon's 2D surface does, but in a higher dimension, so the center is not visible. They conclude there's no center, but equally say there's no evidence for higher dimensions... Of course balloons have centers; you just need to express them in the right dimensions.
So I think they defeat their own argument?? (4D objects still have centers -- just expressed in 4D)
But anyway, I was mainly trying to find out whether there is an infinity within reality, or just stupendous immeasurability?
-- That would be more crazy than being at the center, imho
>If there were a big-bang, and we were at the edge, different parts would accelerate away from us at different speeds
You're operating on a very outdated understanding of what the big bang was. For half a century now we've understood that there was no single point at which the big bang occurred, so there's no center to the universe that everything is spreading out from. The big bang happened everywhere at once, and space has continued expanding everywhere at once ever since.
A 3D object stretched everywhere equally -- I suppose this is the same as the 2D balloon, as it is also stretched everywhere equally.
One says everywhere is equidistant from the origin in a 4th dimension. The other also says everywhere is equidistant, but from zero, and adds a topology with no edges.
A 3D topology with no edges of course requires a 4th dimension though
So I think these are really two ways of saying the same thing?
--
In any case, it really helped answer some other Qs for me
In a simplistic sense, the differences in the Hubble constant are of course a dimension. So adding dimensions, as in string theory, is a fancy way of explaining nothing. But I wondered... is there a correlation between Hubble constant differences and 3D space?
If so, it would (in my unremittingly small brain) show an interaction with dark energy, a.k.a. this 4th dimension.
A clearer way to say that: the skin of the balloon has a thickness, and differences in the Hubble constant are measuring that thickness
-- Your answer was awesome and super helpful -- Can you do it a second time? :)
My guess is that these signals are ambient in nature, like the microwave background radiation. As the article mentions, it's assumed that their wavelength changed over time, but in theory they still exist.
Imagine it as something like the ambient light of a room without a light source during the day.
Can someone ELI5 how sparse neutral hydrogen could physically emit a signal so strong it could travel this far and still be detected? Like, there were no stars, no nuclear reactions, no interactions with energy exchange(?)... Who lost all that energy?
The emission is the so-called ‘forbidden’ hyperfine transition of the spin state of hydrogen’s one electron. It flips from spin up to spin down, and the energy difference is a photon at about 1420 MHz.
It’s called forbidden because it’s extremely unlikely to occur - so unlikely that I don’t believe it’s been observed in the lab. But in aggregate, with vast vast clouds of hydrogen it becomes likely enough and strong enough to detect.
We observe this emission from hydrogen in our own galaxy and throughout the local universe, and it’s strong enough to detect easily.
In this experiment, they’re looking for that same 1420 MHz signal, redshifted down to something like 180 MHz (where exactly it is isn’t known for sure).
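The implied redshift follows from the standard relation f_obs = f_rest / (1 + z); a quick sketch, taking 180 MHz as the nominal observed frequency:

```python
# Redshift implied by seeing the 1420 MHz hydrogen line at ~180 MHz.
f_rest_mhz = 1420.4   # rest-frame 21 cm hyperfine line
f_obs_mhz = 180.0     # roughly where the signal is being sought
z = f_rest_mhz / f_obs_mhz - 1
print(f"z = {z:.1f}")  # z of about 6.9, deep in the epoch of reionization
```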
There are 128 grids of 4x4 antennas. There’s nothing special about the 128 figure, and very soon it will be doubled, though there was also talk of a smaller upgrade.
The 4x4 tile size is a compromise between sensitivity and field of view. 5x5 or 6x6 configurations would be much more sensitive to the signal, but at the expense of a much smaller field of view, since the field of view is formed using phased arrays. For imaging purposes we’d actually prefer the smaller field of view, because it would make it easier to keep nasty bright radio sources from washing out our observations, but for detecting this epoch of reionization signal the bigger field of view is better, since it’s a global full-sky signal.
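The trade-off falls out of basic phased-array scaling: beam width goes roughly as wavelength over aperture, while sensitivity grows with antenna count. A very idealized sketch, where the 1.1 m antenna spacing is an assumption for illustration, not the MWA's actual geometry:

```python
import math

# Idealized tile-size trade-off for a phased array near 180 MHz.
wavelength = 3e8 / 180e6   # ~1.67 m
spacing = 1.1              # assumed metres between antennas in a tile
for side in (4, 5, 6):
    aperture = side * spacing                       # tile width in metres
    beam_deg = math.degrees(wavelength / aperture)  # rough beam / FoV width
    print(f"{side}x{side}: ~{beam_deg:.0f} deg beam, {side * side} antennas")
```

Bigger tiles collect more signal per pointing but see a smaller patch of sky, exactly the tension described above.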
LOFAR uses 24 (or 48) per station for the "high" frequency tiles and 96 for the lower frequency ones if I'm not mistaken, so a power of two is not by definition required.
Is that just to match some characteristic (i.e. number of inputs) of the data aggregation/logging device (more likely), or is it something about the physics of the type of signal they are trying to detect (less likely)?
There are, after all, plenty of possible square grid combinations, but only some are powers of 2.
I could have used that three or four times during the last year in a big EU city: we do consulting, and one of our customers is adamant that their data cannot go on the public cloud. So they have their own data center and some file sharing utility. Problem is, you need to go through their VPN, which is super slow, and their sharing utility closes connections after a while.
Result: transferring 30-40 GiB of data is painful, and we often go to their office to retrieve the data instead of using the Internet. In an ideal world I wouldn't need a service like the one you suggest, but since a lot of our customers seem to be in the Stone Age, I would probably use it.
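For scale, a quick estimate of why 30-40 GiB over a slow VPN hurts (the 10 Mbps figure is a guess at "super slow", not a measured rate):

```python
# Transfer time for 40 GiB over an assumed 10 Mbps VPN link.
gib = 40
bits = gib * 2**30 * 8
vpn_bps = 10e6                  # 10 Mbps, assumed
hours = bits / vpn_bps / 3600
print(f"~{hours:.1f} hours")    # ~9.5 hours, assuming no dropped connections
```

And that is with no connection resets; with a utility that drops sessions "after a while", each retry can start the pain over.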
I have friends who do regular SSD file transfers. Mostly between different countries but sometimes in the same city.
They just use regular messaging services and simple off-the-shelf encryption software, with out-of-band key transfer (i.e. email...).
Hardware encryption (and a platform for key sharing and booking deliveries) could certainly be useful, but I’m not sure about the transport part. I feel like outsourcing it to normal couriers is more reasonable.
It’s amazing that we think this is even possible to achieve — pick up a particular signal that’s been floating around since the beginning of the universe and get interesting info from it... would be so cool if they manage it!
Floating around doesn't do it justice... it's been flying at the speed of light away from its origin for the past 12 billion years, until it ran into a little sensor.
Not likely. I worked at a research station right down the road from a zoo and we didn’t see any interference from any of the animals. Now, the security fence on the other hand? Static City we used to call it.