from gwosc.datasets import event_gps
from gwpy.timeseries import TimeSeries

gps = event_gps('GW170814')
detectors = ['L1', 'H1', 'V1']
for place in detectors:
    ldata = TimeSeries.fetch_open_data(place, int(gps) - 512, int(gps) + 512, cache=True)
    lq = ldata.q_transform(frange=(30, 500), qrange=(4, 12), outseg=(gps - 0.1, gps + 0.1))
    lq = lq ** 0.5
    plot = lq.plot()
    ax = plot.gca()
    ax.set_epoch(gps)
    ax.set_yscale('log')
This was my code for Tutorial 1.4. There are certain white patches on my plot that look like this:
Hi @AmbicaG, that's a really strange plot. The only thing I suspect is that some spectrogram values could be out of range or (more likely) invalid numbers. Can you try plotting the power spectrogram only, i.e. without doing `lq = lq ** 0.5`? Also, can you check something like `numpy.isnan(lq).any()` and `numpy.isnan(lq ** 0.5).any()`?
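To illustrate the check suggested above, here is a minimal sketch with a small NumPy array standing in for the spectrogram values (the array contents are made up for demonstration):

```python
import numpy as np

# Stand-in for q_transform output: mostly positive power values,
# with one negative entry that will break the square root.
lq = np.array([1.2, 0.8, -0.3, 2.5])

print(np.isnan(lq).any())         # False: the raw power values contain no NaNs
print(np.isnan(lq ** 0.5).any())  # True: sqrt of the negative entry produces NaN
```

If the second check returns `True` on the real spectrogram, the white patches are almost certainly NaN pixels.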
I couldn't figure out what the problem with TimeSeries.fetch_open_data is. I'm no longer getting the "read timeout error" as before. However, the 512 s window of data takes around 28 minutes to download, which is too long for files of at most 0.4 GB on my internet connection, but okay. I've just accepted it as a fact of life and moved on.
Now, in tutorial 1.4, when I use the igwn-py39 kernel, I only get the following waveforms:
Time domain waveforms: ['TaylorF2NL', 'PreTaylorF2', 'SpinTaylorF2_SWAPPER']
Frequency domain waveforms: ['SpinTaylorF2_SWAPPER', 'TaylorF2NL', 'PreTaylorF2', 'multiband', 'TaylorF2NL_INTERP', 'PreTaylorF2_INTERP', 'SpinTaylorF2_SWAPPER_INTERP']
However, using the igwn-py39-lw kernel, it loads all waveforms. It might be some kind of incompatibility between my (old) macOS system and the packages listed in the full environment. I believe this is related to the lalsuite package: the lightweight environment file pins lalsuite=7.11, while the full environment file has lalsuite=7.15=pyhd8ed1ab_0. This can be found by running conda compare environment.yml with the full environment active. The pip package versions also mismatch, but I don't think that should be a problem. For the moment, the lightweight environment is working for me. I'm just reporting this so you can check the consistency of the environment files.
@AmbicaG @virtuoso This is because some values in `lq = ldata.q_transform(...)` are negative. When you take the square root of a negative number, the result is NaN (not a number), which is rendered as the white areas in the plot. To solve this, take the square root of the absolute value instead: use `lq = lq.abs() ** 0.5`. Alternatively, plot the power spectrogram directly without computing `lq ** 0.5`, since `lq` itself has no NaN values. Thanks.
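A minimal sketch of the suggested fix, again with a plain NumPy array standing in for the spectrogram (note that `.abs()` is the gwpy method; on a bare NumPy array the equivalent is `np.abs`):

```python
import numpy as np

# Stand-in spectrogram values, including a negative entry.
lq = np.array([1.2, 0.8, -0.3, 2.5])

# Equivalent of lq.abs() ** 0.5 on a gwpy Spectrogram:
amplitude = np.abs(lq) ** 0.5

print(np.isnan(amplitude).any())  # False: no NaNs remain, so no white patches
```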
You are seeing the chosen shape of the time-frequency pixels. The pixels are spaced logarithmically in frequency, so they get taller as you go higher up the plot. See slides #7 and #8.
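The effect of logarithmic spacing can be sketched with NumPy (an illustration of the geometry, not the actual gwpy pixel layout; the 30–500 Hz range and bin count are arbitrary choices here):

```python
import numpy as np

# Log-spaced frequency bin edges between 30 Hz and 500 Hz (10 bins).
edges = np.logspace(np.log10(30), np.log10(500), 11)
heights = np.diff(edges)  # height of each pixel in Hz

# Each bin is strictly taller than the one below it,
# which is why pixels grow taller toward the top of the plot.
print(all(heights[1:] > heights[:-1]))  # True
```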
The normalized energy is the whitened strain data, projected onto each time-frequency pixel. You can see this defined on slide 13.
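As a rough illustration of what "whitened strain" means (a simplified sketch, not the tutorial's actual pipeline: here the ASD is crudely estimated from the data itself): divide the Fourier transform of the data by its amplitude spectral density, so every frequency contributes equally before projecting onto the pixels.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4096
data = rng.normal(size=n)            # toy strain-like time series

spec = np.fft.rfft(data)
asd = np.abs(spec)                   # crude per-bin amplitude estimate
white_spec = spec / asd              # whitening: flatten the spectrum
white = np.fft.irfft(white_spec, n)  # whitened time series

print(np.allclose(np.abs(white_spec), 1.0))  # True: spectrum is flat after whitening
```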
I am trying to recreate spectrogram2 from Tutorial 1.3 in Google Colab, but I am unable to get the desired result. I am attaching my code and the output here.
Could anyone help me with what I am doing wrong?
Hi, everyone! I'm having trouble running Tutorial 1.2 on Google Colab. For some reason, it's overriding the igwn-py39-lw environment. I get an error when running ldata.plot(), as you can see in the image below. Can you help me with that?
For Tutorial 1.3: when I try to run the following instructions:
ax = plot.gca()
ax.set_yscale('log')
ax.set_ylim(10, 1400)
ax.colorbar(
    clim=(1e-24, 1e-20),
    norm="log",
    label=r"Strain noise [$1/\sqrt{\mathrm{Hz}}$]",
)
plot  # refresh
I got the following error:
AttributeError: 'NoneType' object has no attribute '_get_renderer'
It's the same if I skip this part and move on to the Q-transform. I have tried to find a solution on the Internet, but I couldn't.
Hi Lucas, you can read the backtrace to find out which of those lines is causing the problem. I suspect it may be the last one (it should probably be plot.plot(), but I'm not sure).
If you copy the entire error message (with the backtrace) we could probably help you better.