const functions = require("firebase-functions");
const admin = require("firebase-admin");
const express = require('express');
const cors = require('cors');
const fetch = require('node-fetch');
// Initialize the app (pointed at the local emulator database)
if (!admin.apps.length) admin.initializeApp({ databaseURL: 'http://localhost:9000?ns=demo-take-home-assignment' });

const app = express();
// Automatically allow cross-origin requests
app.use(cors({ origin: true }));
/*
  Challenge 1
*/
// Get a database reference to our blog
const db = admin.database();

const fetchTop25 = async (req, res) => {
  const response = await fetch('https://api.coincap.io/v2/assets');
  const json = await response.json();
  const ids = json.data.slice(0, 25).map(x => x.id);
  await db.ref('top25').set(ids);
  if (res) {
    res.send(`
      <!doctype html>
      <html>
        <body>
          <span id="ids">${JSON.stringify(ids)}</span>
        </body>
      </html>`);
  }
};
app.get('/top25', fetchTop25);

/*
  Challenge 1 Comments:

  Q. Lastly, within the comment section for Challenge #1, describe what would happen if you tried to make this call directly from the frontend? Is this a desired / expected result?

  A frontend is not the source of truth: it makes requests to a backend, and the backend returns a response. If the frontend fetched from CoinCap directly, the database would not necessarily be updated.

  Also, Cloud Functions cost money every time they run. A frontend that did not trigger the backend would not necessarily cost PayClearly money, but it might only display stale data rather than up-to-date information.

  Thirdly, delegating the burden of computation to a user's device is not always reasonable, since older or less capable devices might not support the fetch.

  In fact we do fetch from CardTable.svelte: it calls port 5001, the emulator's functions port, so our Svelte frontend is fetching our own functions' data stored in the Realtime Database. Even if a frontend has safeguards, we want the backend to validate users' data rather than trust it, to guard against malice such as SQL injection.
*/

/*
  Challenge 2
*/
const queryDb = (table) => {
  return new Promise((resolve, reject) => {
    // Use once() so the listener detaches after the first value;
    // on() would leave a live listener attached after the promise resolves.
    db.ref(table).once(
      'value',
      snapshot => resolve(snapshot.val()),
      error => reject(error)
    );
  });
};

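As background for the push-id discussion in the comments below, here is a toy illustration of why timestamp-prefixed keys sort chronologically. This is NOT Firebase's actual push-id algorithm (which also encodes random bits in a base-64-like alphabet); `makeKey` is a hypothetical stand-in that only demonstrates the ordering idea:

```javascript
// Toy illustration (not Firebase's real algorithm): keys that begin with a
// zero-padded timestamp sort lexicographically in creation order, so a
// key-ordered list is automatically chronological.
let lastTs = 0;
let counter = 0;
const makeKey = (ts) => {
  // Bump a counter to disambiguate keys created in the same millisecond.
  counter = ts === lastTs ? counter + 1 : 0;
  lastTs = ts;
  return String(ts).padStart(15, '0') + '-' + String(counter).padStart(4, '0');
};
```

Because the timestamp is zero-padded to a fixed width, ordinary string comparison of the keys matches the order in which they were generated.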
/*
  Challenge 2 Comments:

  Q. Lastly, within the comment section for Challenge #2, describe what some of the benefits are to using push ids vs. other indexing strategies? What problems and challenges do they solve?

  Push ids do not overwrite data that may already exist; they append to a list instead. This preserves all state from start to finish.

  Push ids are generated from a timestamp and still manage to be unique. Coupled with the ordering that pushing preserves, this provides a chronological lookup that is instantaneous and scalable. I think of it as an array (an index for instant lookup) combined with a dictionary (or associative array, or hash): the key is not a simple numeric index but a string.

  According to the documentation:
  "Transactions are slower and more complex. They require one or more round trips to the server. A key generated by `push()` on the client works while offline and is optimized for performance."

  This matters because we are working with realtime data that can be modified by multiple users simultaneously:

  "The unique key is based on a timestamp, so list items will automatically be ordered chronologically. Because Firebase generates a unique key for each blog post, no write conflicts will occur if multiple users add a post at the same time."

  This is a really big deal and the reason push ids are preferable: as more users are added, the much larger likelihood of conflicts (akin to hash collisions) is avoided.

  Challenge 3

  Let's assume we had 1000 cryptocurrencies to update every 10 seconds, but CoinCap's API could not support that request volume. We might need a function that can batch our requests in order to reduce the request volume at any given moment.

  For Challenge #3, we want you to implement your Challenge #2 solution with these API limitations in mind. Let's assume that CoinCap can only process 5 /assets/{id}/ endpoint requests at once. Comment out your Challenge #2 solution, and rewrite your function in a way that batches your requests. These batches should run sequentially (one batch finishes running before the next starts).

  Challenge 3 Comments:

  I can see, say, 1 million assets updating every 10 seconds as a potential issue that batching could solve, but I doubt 1000 cryptocurrencies updating every 10 seconds would be an issue. I would suggest not making several API calls when one call will do: fetching /assets without an id returns all the data, which would likely be more performant than making n = 25 calls. This could save money for PayClearly if the external third-party API costs money at all (or does in the future), which is important for the bottom line.
*/

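The core of the batching requirement can be isolated into a small, standalone helper, shown here as a sketch. `runInBatches` is a hypothetical name (not part of the assignment); `worker` stands in for the per-id fetch that `updateTop25Currencies` performs:

```javascript
// Sequential batching: process items in chunks of `batchSize`, awaiting
// each chunk before starting the next.
const runInBatches = async (items, batchSize, worker) => {
  const results = [];
  for (let i = 0; i < items.length; i += batchSize) {
    const batch = items.slice(i, i + batchSize);
    // Requests within a batch run concurrently...
    const batchResults = await Promise.all(batch.map(worker));
    // ...but the next batch does not start until this one has settled.
    results.push(...batchResults);
  }
  return results;
};
```

With a helper like this, the function below could be written as `runInBatches(cryptoIds, 5, id => fetch(`${baseUrl}/${id}`).then(res => res.json()))`, keeping the rate-limit logic separate from the database writes.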
const updateTop25Currencies = async (batchSize = 5) => {
  try {
    const baseUrl = "https://api.coincap.io/v2/assets";
    let cryptoIds = await queryDb('top25');
    if (!cryptoIds) {
      await fetchTop25();
      cryptoIds = await queryDb('top25');
    }
    for (let i = 0; i < cryptoIds.length; i += batchSize) {
      const promises = cryptoIds.slice(i, i + batchSize)
        .map(id => fetch(`${baseUrl}/${id}`)
          .then(res => res.json())
        );
      // Each response has the shape { data: {...} }
      const cryptoDetails = await Promise.all(promises);
      // Await the writes so one batch fully finishes before the next starts
      await Promise.all(
        cryptoDetails.map(c => db.ref('assets/' + c.data.id).set(c.data))
      );
    }
  } catch (err) {
    console.error(err);
  }
};
// Refresh every 10 seconds (workable under the local emulator; a deployed
// function would use a scheduled trigger instead)
setInterval(updateTop25Currencies, 10000);

// Expose Express API as a single Cloud Function:
exports.api = functions.https.onRequest(app);