
So I'm building a data hub. Basically I want to store large amounts of data in a database.

To get data in, my data providers create data streams (pipelines) through my APIs.

I want to secure the APIs so I know who's creating the streams, and so I can limit who can create them and who can send to them. Does it make sense to use OAuth for the non-ingestion API methods and an API key for the ingestion methods?

OAuth tokens tend to expire, and ingestion is a long-running process. This doesn't feel like the right solution, since it means two separate security protocols are in use.

The other option I see right now is to force users to check the expiration time of their token and try to refresh it if it's about to expire and they still need to send data.


1 Answer


OAuth 2.0 supports long-running offline Clients via the refresh token grant: when the current access token expires, the Client uses its refresh token to obtain a new one. The user doesn't need to be involved, since a proper OAuth 2.0 Client should handle expired tokens on its own.
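As a rough sketch of what "handling it on its own" means, a client can wrap its access token in a small manager that refreshes it shortly before expiry. The token-endpoint call is injected as a callable here so the logic is visible without a real authorization server; the class and parameter names are hypothetical, not part of any specific OAuth library.

```python
import time

class TokenManager:
    """Holds an access token and transparently refreshes it before it expires."""

    def __init__(self, refresh_func, access_token, expires_in, refresh_token,
                 skew=30):
        # refresh_func is a callable(refresh_token) -> token response dict;
        # in a real client it would POST to the token endpoint with
        # grant_type=refresh_token.
        self._refresh = refresh_func
        self._refresh_token = refresh_token
        self._skew = skew  # refresh this many seconds before actual expiry
        self._store(access_token, expires_in)

    def _store(self, access_token, expires_in):
        self._access_token = access_token
        self._expires_at = time.monotonic() + expires_in

    def get_token(self):
        # Refresh if the token is expired or about to expire.
        if time.monotonic() >= self._expires_at - self._skew:
            resp = self._refresh(self._refresh_token)
            # The server may rotate the refresh token in its response.
            self._refresh_token = resp.get("refresh_token", self._refresh_token)
            self._store(resp["access_token"], resp["expires_in"])
        return self._access_token
```

The ingestion loop then just calls `get_token()` before each send and never has to reason about expiry itself, which is the behavior the answer describes.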