How to get (scrape) stock prices/quotes from websites in Google spreadsheets using ImportXml

It is no secret that I like to spend some time on investing, buying and selling stocks and index funds. I use Google spreadsheets to keep track of my portfolio. In Google spreadsheets you can get most prices using the built-in function =GOOGLEFINANCE(<insert stock ticker>), for example: =GOOGLEFINANCE("CPH:GEN"). I describe this in detail in another post of mine.

However, some markets are not supported by this (the Oslo Stock Exchange, for example), and Nordic funds in particular are missing. Therefore I decided to try to scrape these prices off external sites using ImportXml.

You can find a working example of the below here.

How to get stock quotes using ImportXml

I first tried my favourite website for charts and quotes, TradingView. However, they use websockets and load the data asynchronously, so I could not scrape their website: the data is not present when the scraper receives the HTML. Other sites, like MarketScreener or Yahoo, have the stock prices as part of the initial HTML, so their pages can be scraped. In this example I use one such site. It is a good idea to find a single site that has all the funds you wish to scrape, although you can of course get them from different sites.
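A quick way to tell whether a site can be scraped this way is to fetch the raw HTML (with no JavaScript executed, which is how the scraper sees it) and check whether the price already appears in it. A minimal Python sketch of that check; the fetch helper and the sample fragment are illustrative, not from a real request:

```python
import urllib.request

def fetch_raw_html(url: str) -> str:
    """Fetch the page as a scraper sees it: no JavaScript runs."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")

def price_in_initial_html(html: str, price: str) -> bool:
    """If the price only arrives later over a websocket, it will be
    missing from the initial HTML and ImportXml cannot see it."""
    return price in html

# Checking against a static fragment (a real check would pass the
# result of fetch_raw_html for the page you want to scrape):
sample = '<td class="fvPrice colorBlack">154.55</td>'
print(price_in_initial_html(sample, "154.55"))  # True
```

If this prints False for a page, the data is loaded asynchronously and ImportXml will not find it.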

The fund I started with was "Danske Invest Fjernøsten" (not a recommendation for or against). I used the following formula to import the quote:

=IMPORTXML("";"//td[@class='fvPrice colorBlack']")

The part of the HTML where the price was located, looked like the following:

<td style="padding-bottom:5px" class="fvPrice colorBlack" id="zbjsfv_dr">154.55</td>

The current price as I am writing this is "154.55", as seen in the above HTML. The XPath "//td[@class='fvPrice colorBlack']" used to find the price reads as: "find any td element with the classes fvPrice and colorBlack". I could also have used the id: "//td[@id='zbjsfv_dr']". This is enough to import the data into your Google spreadsheet. Depending on your locale, the result may look odd, with an extra .00 after it, or it may look like a completely different number. In some countries a dot (.) marks the decimal place and in others a comma (,) is used, which causes issues if the site uses a different format than your spreadsheet. Google spreadsheets may also (depending on your settings) add decimals it believes are missing, which gives the extra .00.
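To see what the XPath actually matches, here is a rough Python equivalent run on the HTML fragment above. Note this is only a sketch: ImportXml in Google spreadsheets handles real-world HTML, while the standard library's ElementTree requires well-formed XML and a leading `.//` in the path.

```python
import xml.etree.ElementTree as ET

# The td fragment from the page, wrapped in a parent element so the
# XPath search below has something to descend into.
html_fragment = (
    '<td style="padding-bottom:5px" class="fvPrice colorBlack" '
    'id="zbjsfv_dr">154.55</td>'
)
doc = ET.fromstring(f"<table><tr>{html_fragment}</tr></table>")

# The class-based XPath from the article (exact match on the full
# class attribute "fvPrice colorBlack"):
cells = doc.findall(".//td[@class='fvPrice colorBlack']")
price = cells[0].text
print(price)  # 154.55

# The id-based variant mentioned in the article finds the same cell:
assert doc.find(".//td[@id='zbjsfv_dr']").text == price
```

The exact-match caveat matters: if the site reorders or adds classes on that td, the class-based path stops matching, which is one reason the id-based path can be more robust.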

I therefore ended up trimming any ".00" and then converting the dot to a comma. I do this by using the SUBSTITUTE function twice, as seen below:

=SUBSTITUTE(SUBSTITUTE(IMPORTXML("";"//td[@class='fvPrice colorBlack']"); ".00"; ""); "."; ",")
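The two nested SUBSTITUTE calls are easier to follow written out as plain string operations. A small Python sketch mirroring them (the function name is mine, not from the spreadsheet):

```python
def normalize_price(raw: str) -> str:
    """Mirror the two SUBSTITUTE calls: drop any '.00' that Google
    spreadsheets may add, then switch the decimal dot to a comma."""
    return raw.replace(".00", "").replace(".", ",")

print(normalize_price("154.00"))  # 154
print(normalize_price("154.55"))  # 154,55
```

Note that, just like the spreadsheet formula, this removes ".00" anywhere in the string, so it is a pragmatic fix for these particular quotes rather than a general-purpose number formatter.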

Pretty? No, but that is what you get when you screen scrape. I should warn that this sort of "integration" can break if the site you are importing from changes its HTML. If, for example, they change the classes or the id on the element you get the price from, you will get an error.

Below is the end result:


Finding Danish index funds

The formula also works for other funds, such as "Sparindex Index Japan Value KL" (not a recommendation for or against):

=SUBSTITUTE(SUBSTITUTE(IMPORTXML("";"//td[@class='fvPrice colorBlack']"); ".00"; ""); "."; ",")

Which gives the current price: "97,28" as I am writing this.

Finding Norwegian stocks

Another type of stock not found in the Google Finance API is Norwegian stocks. Prices for these can also be fetched using ImportXml, for example "Equinor" (not a recommendation for or against):

=SUBSTITUTE(SUBSTITUTE(IMPORTXML("";"//td[@class='fvPrice colorBlack']"); ".00"; ""); "."; ",")

This gives the price "134,35" as I am writing this.

That is it

You can find a working example of the above here.

This is my way of importing stock prices from external websites. I hope it helps you have less to update manually in your spreadsheets! The spreadsheet of course updates these values automatically. Leave a comment below if you found this helpful - please :) You may also leave a comment if you want help with scraping prices off websites other than the ones mentioned in this post.