How to get the fd waveform of time series in a data file?

Hello! I am trying to get the ASD of a time series in a data file, just like in Tutorial 1.4 of GWOSC#6, where you use "hp, hc = get_fd_waveform(approximant='IMRPhenomPv2_NRTidalv2', ...)" to get the frequency-domain waveform of a particular event. In my case, I have a data file where the strain and the time are in two columns, and I want to get the time-domain and the frequency-domain plots. So I am not sure how or where to read in the data file, and also not sure whether Tutorial 1.4 (Generating waveforms) or Tutorial 1.3 (Q-transforms with GWpy) is the way to get there.
Any suggestions?
Thank you!!!

Hi @Lucy

I’m not sure I understand your question. But, it sounds like you have a text file with two columns for strain and time, and you want to:

  • Read in the file
  • Plot it in the time-domain
  • Plot it in the frequency domain

I did this with a sample file that I downloaded from the quickview app and named strain.csv. I was able to do this with the code below. gwpy should be smart enough to read in TXT or CSV files automatically, which is what I rely on here.

from gwpy.timeseries import TimeSeries

# read the two-column (time, strain) file; gwpy infers the format
timedata = TimeSeries.read('strain.csv')

# taper the ends to reduce spectral leakage, then take the FFT
freqdata = timedata.taper().fft()
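If you want to see what those steps are doing under the hood, here is a minimal sketch with plain numpy (the file name strain.csv, the sample rate, and the 100 Hz test signal are all made up for illustration; gwpy handles the bookkeeping for you):

```python
import numpy as np

# Synthesize a two-column (time, strain) file so the sketch is self-contained:
# a 1-second, 4096 Hz record of a 100 Hz sine wave
t = np.linspace(0, 1, 4096, endpoint=False)
strain = np.sin(2 * np.pi * 100 * t)
np.savetxt('strain.csv', np.column_stack([t, strain]), delimiter=',')

# Read it back and take the one-sided FFT, mirroring the gwpy steps above
data = np.loadtxt('strain.csv', delimiter=',')
time, h = data[:, 0], data[:, 1]
dt = time[1] - time[0]
freqs = np.fft.rfftfreq(len(h), dt)
spectrum = np.fft.rfft(h) * dt  # dt factor approximates the continuous FT
```

The spectrum should then peak at the 100 Hz signal frequency, which is a quick sanity check that the file was read with the columns the right way around.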

To add an ASD, you could do:

asd = timedata.asd()
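For intuition, timedata.asd() computes a Welch-style averaged spectrum and takes its square root. A rough scipy sketch of the same idea (the sample rate, duration, and white-noise stand-in for strain are all invented for illustration):

```python
import numpy as np
from scipy.signal import welch

fs = 4096                      # assumed sample rate (Hz)
rng = np.random.default_rng(0)
h = rng.normal(scale=1e-21, size=8 * fs)  # 8 s of white noise as fake strain

# Welch-average the one-sided PSD over 4-second segments,
# then take the square root to get the ASD
f, psd = welch(h, fs=fs, nperseg=4 * fs)
asd = np.sqrt(psd)
```

For white noise of standard deviation sigma, the ASD should sit near sigma * sqrt(2 / fs) across the band, which is a useful sanity check on the normalisation.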

In this example, the file strain.csv has two columns: time and strain.


Hi @jonah!
Thank you so much for the help!!! Yes, that is exactly what I wanted to do. I'll try it in a Google Colab notebook then.

Hi Jonah,
One more question… can I do this in Google Colab? I mean, my file is on my computer, and I am guessing that I should point to the *.csv file by giving its full path, but Colab is not finding it. Do I necessarily have to install gwpy on my computer? Is there no way to upload my files to the "cloud" so Google Colab can find them?
Thank you!

Hi Lucy,

I believe you can't open local files directly on Google Colab, but you may be able to open them if you put them on Google Drive. I found this recipe but I have not tested it.
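Something along these lines may also work (untested on my side; the google.colab module only exists inside a Colab runtime, so this sketch is guarded to be a no-op anywhere else):

```python
# Sketch of getting a local file into Google Colab
try:
    from google.colab import files
    uploaded = files.upload()   # opens a browser file-picker dialog
    names = list(uploaded)      # names of the files you picked
except ImportError:
    names = []                  # not running inside Colab

# Alternatively, mount Google Drive and read from a path under it:
#   from google.colab import drive
#   drive.mount('/content/drive')
#   # then e.g. TimeSeries.read('/content/drive/MyDrive/strain.csv')
```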


I’ll check it out! Thank you @martinberoiz !!!