DataPlugFacebook - ACCESS_TOKEN


#1

Hi,

I am testing DataPlugFacebook, but cannot figure out how to call an endpoint. In particular, from the docs, [https://github.com/Hub-of-all-Things/DataPlugFacebook], what should the ACCESS_TOKEN be in the following endpoint?

localhost:3000?hat_token=$HAT_ACCESS_TOKEN

In HAT 2.0 (https://github.com/Hub-of-all-Things/HAT2.0), a Postgres table named user_access_token is created after deployment, and I successfully used the values in that table to test the HAT API (http://hub-of-all-things.github.io/doc/).
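
In case it helps to show what I mean, this is roughly how I read the values out of that table (just a sketch; the database name and credentials are placeholders for my local deployment):

```python
import psycopg2

# Pull the tokens out of the user_access_token table that HAT 2.0 creates on
# deployment. The connection details below are placeholders for my local setup.
conn = psycopg2.connect(dbname="hat20", user="hat20", password="hat20", host="localhost")
with conn, conn.cursor() as cur:
    cur.execute("SELECT * FROM user_access_token;")
    for row in cur.fetchall():
        print(row)
conn.close()
```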

Similarly, in the case of DataPlugFacebook, should there be a table/collection storing access_tokens?

I tried it this way, but it failed: I ran HAT 2.0 and, at the same time, used the values from its user_access_token table to test DataPlugFacebook.
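
Concretely, the request I made looked roughly like this (the token value is a placeholder for what I copied from user_access_token):

```python
import requests

# What I tried: hit the plug on localhost:3000 with a token copied from HAT's
# user_access_token table, using the URL format quoted from the docs above.
HAT_ACCESS_TOKEN = "<value copied from user_access_token>"

resp = requests.get("http://localhost:3000", params={"hat_token": HAT_ACCESS_TOKEN})
print(resp.status_code, resp.text)
```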

In addition, DataPlugFacebook uses MongoDB rather than Postgres. Is there any reason for this?

For getting data from different sources, I know Apache Camel is a good option (http://camel.apache.org/components.html). What do you think about leveraging this tool?

Finally, if I am not mistaken, this forum is based on Discourse (discourse.org/). If so, it would be nice to enable user login with a Google account. This is a built-in feature of Discourse.

Sorry for the long message.

Thank you.


#2

Hey there,

With the recent updates to DataPlugFacebook, the new URI format is as follows (assuming the default configuration has not been changed):

http://localhost:3000/facebook?hatUrl=$HAT_URL&hatAccessToken=$HAT_ACCESS_TOKEN

Here, the $HAT_URL parameter should be a fully qualified URL string pointing to the HAT server,
and $HAT_ACCESS_TOKEN should be an access token found in the user_access_token table in the HAT database.
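
For example, triggering the plug from a script could look something like this (just a sketch; HAT_URL and HAT_ACCESS_TOKEN are placeholders for your own HAT address and token):

```python
import requests

# Call the plug's new endpoint with the two parameters described above.
# Both values are placeholders for your own deployment.
HAT_URL = "http://your-hat-server:8080"
HAT_ACCESS_TOKEN = "<token from the user_access_token table>"

resp = requests.get(
    "http://localhost:3000/facebook",
    params={"hatUrl": HAT_URL, "hatAccessToken": HAT_ACCESS_TOKEN},
)
print(resp.status_code)
```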

To get the plug fully functional, you will also have to set up a new application with Facebook and set its App ID and App Secret as the environment variables FB_APP_ID and FB_APP_SECRET, respectively.
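
If you want to sanity-check that the plug's process will actually see those credentials, a quick script like this (purely illustrative, not part of the plug itself) does the trick:

```python
import os

# Check that the Facebook app credentials mentioned above are present in the
# environment before starting the plug.
required = ["FB_APP_ID", "FB_APP_SECRET"]
missing = [name for name in required if not os.environ.get(name)]
if missing:
    raise SystemExit("Missing environment variables: " + ", ".join(missing))
print("FB_APP_ID and FB_APP_SECRET are set.")
```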

Once that is done, the plug should be good to go.

The access tokens to both HAT and Facebook are stored in the Mongo database and used for periodic synchronisation.
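
If you want to inspect them, you can connect to the Mongo instance directly; here is a rough sketch (the database name below is only a guess, so check your own plug configuration for the actual one):

```python
from pymongo import MongoClient

# Connect to the local MongoDB instance used by the plug (default port assumed)
# and list its collections to see where the HAT and Facebook tokens are kept.
client = MongoClient("mongodb://localhost:27017/")
db = client["dataplug_facebook"]  # database name is a guess; adjust to your config
print(db.list_collection_names())
```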

Currently, we are also working on a new release to significantly improve both user and developer experience. As a result, current endpoints are subject to further change in the near future.

Hope this helps you with setting up a functioning system.

Regards,
Gus