Another Friday, another diary; take a read below.
Planning
This week started off the same as last week: developing the backend. While reviewing all the features of the system I realised a few had been forgotten. Once I had completed the forgotten features, I started development on the single-page web app (SPA).
Additionally, at the start of the week I used Trello to plan out all the tasks I predicted I would complete this week. Planning within software development can be incredibly difficult; it is hard to predict where and when errors will happen. The plan is used as an indicator rather than a fixed schedule, and it's important to maintain flexibility: throughout the week tasks could be added and/or re-ordered.
Unknown Errors
When building software, unexpected errors will inevitably pop up. While testing downloads from the OneDrive API, an ECONNRESET error was thrown. An ECONNRESET error occurs when a TCP connection is unexpectedly closed by the other side, most commonly because the other side is overloaded and kills the connection. Instead of trying to debug the socket, which would have wasted a lot of time, I took a more methodical approach: to find the cause, I eliminated each variable, one by one, until one was left. Further testing confirmed that the error was occurring due to too many HTTP requests to the same domain.
Having concluded it wasn't OneDrive causing the error, I theorised it could be the access point. Each HTTP request passes through a public access point, and firing multiple requests in quick succession could cause the access point to refuse a request in an attempt to throttle a user's usage. I switched from a wireless to a wired connection and tested again; the error never occurred.
Even though the ECONNRESET error is incredibly rare, I still chose to handle it by sending a 429 status code back to the client, allowing the client to decide how to handle the failed request. Figuring out the cause of the error took a substantial amount of time, which pushed the completion date back by a day; luckily we had planned for such an event.
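The handling can be sketched with a small helper that maps the low-level socket error onto an HTTP status the client can act on; `statusForError` is a hypothetical name for illustration, not the backend's actual code:

```javascript
// Translate a low-level socket error into an HTTP status code.
// ECONNRESET maps to 429 (Too Many Requests) so the client knows
// to back off and retry; anything else is an unexpected failure.
function statusForError(err) {
  if (err && err.code === 'ECONNRESET') {
    return 429;
  }
  return 500;
}
```

Inside a route handler, the catch block would call this helper and respond with `res.status(statusForError(err))`, leaving the retry decision to the client.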
Async Tasks
Akin to last week, this week I continued to work with Node.js' asynchronous nature. Components can be downloaded from the OneDrive API either synchronously (one by one) or asynchronously (in parallel). Downloading synchronously ensures the API is never overloaded but is considerably slower; downloading asynchronously is much quicker but can easily overload the server. Both approaches utilise Node.js 'async/await' and promises.
To start each task in order without waiting for it to complete, a simple forEach loop can be used with the anonymous function defined as asynchronous. In the example below the console.log will fire even if all tasks are not complete.
```javascript
async function asyncTasks(array) {
  array.forEach(async (element) => {
    await task(element);
  });
  console.log('done');
}
```
To wait for each task to complete before starting the next, a 'for of' statement can be used. A 'for of' statement creates a loop iterating over iterable objects such as a string or an array. In the example below the 'array' is the iterable object; only when every task is complete will the 'console.log' fire.
```javascript
async function asyncTasks(array) {
  for (const element of array) {
    await task(element);
  }
  console.log('done');
}
```
To run all the tasks in parallel and wait for every one to complete, a combination of 'array.map' and 'Promise.all' can be used. In the example below 'array.map' creates a new array of the promises returned by the function passed in as a parameter. To pass additional data to the task function, the second parameter of 'array.map' can be an object, which the task function can access through the 'this' keyword.
```javascript
async function asyncTasks(array) {
  const promisesToResolve = array.map(task);
  await Promise.all(promisesToResolve);
  console.log('done');
}
```
Running too many tasks in parallel can be too intensive for the CPU or memory; to stop this from happening you can 'array.map' over a subset of the whole array at a time.
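That batching idea can be sketched as follows; `asyncTasksBatched` and its parameters are illustrative names, not the project's actual code:

```javascript
// Throttle parallel work by mapping over fixed-size chunks of the
// array rather than the whole array at once. `task` stands in for
// any async function (e.g. a single OneDrive download).
async function asyncTasksBatched(array, batchSize, task) {
  const results = [];
  for (let i = 0; i < array.length; i += batchSize) {
    const batch = array.slice(i, i + batchSize);
    // Only `batchSize` tasks are in flight at any one time.
    results.push(...await Promise.all(batch.map(task)));
  }
  return results;
}
```

Calling `asyncTasksBatched(items, 5, download)` would run at most five downloads in parallel, waiting for each batch to finish before starting the next.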
Utilising these three methods when building the backend has been incredibly useful, for example using the 'Promise.all' method to add multiple components to the MongoDB database. To read more about JavaScript 'async/await' loops, check out the Lavrton Blog.
Database Specification
From the ground up the system has been built to allow each component to be swapped out; for example, the file storage could be OneDrive or Google Drive, and the frontend could be an SPA or a mobile application. Both the file storage and the frontend communicate via an API, so each one can easily be swapped out. The server, on the other hand, hosts the database, which ties it to the rest of the system.
Taking inspiration from OpenGL, a programming interface for rendering 2D and 3D vector graphics, I developed a database specification which any developer can use. A specification outlines all the functions, what parameters they should take in, and what they should return, but leaves the implementation of each function to the developer.
OpenGL is entirely a specification; it relies on developers to implement it, which allows it to be cross-language and cross-platform. The database specification allows developers to implement any type of database needed with minimal code changes, making the database as swappable as the frontend and file storage.
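As a rough illustration of the idea (the method names here are hypothetical, not the actual functions my specification defines), such a specification might look like this in Node.js:

```javascript
// The specification: a base class naming every function, its
// parameters, and its result, with no implementation supplied.
class DatabaseSpec {
  // Insert a component and resolve with its generated id.
  async addComponent(component) { throw new Error('not implemented'); }
  // Resolve with the component matching `id`, or null if absent.
  async getComponent(id) { throw new Error('not implemented'); }
  // Remove the component matching `id`; resolve with true on success.
  async removeComponent(id) { throw new Error('not implemented'); }
}

// A trivial in-memory implementation of the same specification;
// a MongoDB-backed class could replace it without other code changes.
class InMemoryDatabase extends DatabaseSpec {
  constructor() {
    super();
    this.store = new Map();
    this.nextId = 1;
  }
  async addComponent(component) {
    const id = this.nextId++;
    this.store.set(id, component);
    return id;
  }
  async getComponent(id) {
    return this.store.get(id) ?? null;
  }
  async removeComponent(id) {
    return this.store.delete(id);
  }
}
```

Because the rest of the server only ever calls the methods the specification names, swapping `InMemoryDatabase` for any other implementation is a one-line change.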
Conclusion
Overall, this week presented a few challenges that took more time to navigate than predicted. I continue to improve my programming skills as well as my understanding of asynchronous programming. The next few weeks have been planned out, still allowing two weeks of extra development time.