Low Code Data Pipeline: data.world dataset to TigerGraph ingestion using TigerFlow

Mohamed Zrouga
Feb 4, 2022

In this article, we will walk through the steps required to create a low-code pipeline that gets data flowing from data.world into TigerGraph.

Before going further:

You will need the following:

  • data.world account: API access and a dataset to load.
  • Docker Engine: to run the TigerFlow container.
  • TigerGraph instance: get started for free with tgcloud.

data.world API Credentials:

After signing up for an account, visit the following link:

https://data.world/settings/advanced

You will need to grab the Read/Write API token from this page.
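
If you want to verify the token before wiring it into the flow, a quick curl call against data.world's REST API should echo back your account details. This is just a convenience check, not part of the flow; the DW_API_TOKEN variable name is a placeholder:

export DW_API_TOKEN="<your Read/Write token>"
# A valid token should return your account profile as JSON.
curl -H "Authorization: Bearer $DW_API_TOKEN" https://api.data.world/v0/user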

Start a TigerGraph Instance:

If you are starting fresh with TigerGraph Cloud, I would recommend reading the following article by Akash Kaul.

Start a TigerFlow Container and Import the Flow:

  • Start a new container:
docker run -it -p 1880:1880 -v tigerflow-data:/data --name myTigerGraphPipeLine graphflow/tigerflow
  • Import the flow:

Once the container is running, open the TigerFlow editor at http://localhost:1880 (the port mapped in the docker command above). Before you import the flow, you will need to install the node-red-contrib-loop nodes using Menu -> Manage Palettes.

After installing the required node, you can proceed to import the flow:

Go to File -> Import, paste the flow JSON into the view, and hit Import.

After importing the flow, you should see the following view.

  • The first step is to edit the URL to match your data source's user and name:

The general format is "https://query.data.world/sql/<user>/<datasource>".

In this tutorial, we will be using a Twitter dataset uploaded by the user mgjohns.

Then you need to add the token from the data.world credentials step above.
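
As a rough sanity check outside the editor, you can call the same endpoint with curl. The dataset slug and the table name below are placeholders (the exact table depends on the dataset you load), so substitute your own values:

# <datasource> is the dataset slug; "tweets" is a hypothetical table name.
curl -G -H "Authorization: Bearer $DW_API_TOKEN" \
  --data-urlencode "query=SELECT * FROM tweets LIMIT 10" \
  https://query.data.world/sql/mgjohns/<datasource>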

Next, you need to edit the query in the function node to match your data source.

Lastly, edit the node and the graph credentials to point at your TigerGraph instance.
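
Before running the pipeline end to end, it can help to confirm the TigerGraph instance is reachable. A minimal sketch, assuming the default REST++ port 9000 on a self-managed instance (a tgcloud instance may expose the endpoint on a different host/port, so check your cluster details):

# The /echo endpoint should answer with a small "Hello GSQL" JSON message.
curl http://<tigergraph-host>:9000/echo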

Watch this video for an in-depth tutorial on how to use TigerFlow:
