Today’s web is a big-event web - Node.js

Started by Elena_L, 04-02-2018, 00:16:15



Typical web forms are "big-event" submitters. In other words, lots of data entry and selection happens — a user fills out text boxes, selects choices from combo boxes, selects items from a list, and so on — and then all of that information is submitted to a server. There's a single "big event" from the programming perspective: the submission of all that form data, usually through a POST. That's pretty much how the web operated, pre-Ajax.

Sending lots of data at one time

With Ajax, there is a little more of what's called evented programming. There are more events that trigger interaction with the server. The classic case is the entry of a zip code, and then a resulting call to the server to get a city and state. With Ajax and the XMLHttpRequest object, tons of data didn't have to be gobbed up and thrown to the server all at once. However, that doesn't change the reality that the web is still mostly a big-event place. Ajax is used far more often to achieve interesting visuals, do quick validations, and submit forms without leaving a page than it is to create truly evented web pages. So even though a form isn't submitting a big gob of information with a POST, an Ajax request often ends up doing much the same thing.
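As a rough sketch of that zip-code case (the /city-state endpoint, the field ids, and the response format here are made up for illustration), the page-side code looks something like this:

var zipField = document.getElementById('zip');   // assumes an <input id="zip"> on the page

zipField.addEventListener('change', function () {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/city-state?zip=' + encodeURIComponent(zipField.value));
  xhr.onload = function () {
    if (xhr.status === 200) {
      // hypothetical plain-text response such as "Austin, TX"
      document.getElementById('city-state').value = xhr.responseText;
    }
  };
  xhr.send();
});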

Honestly, that's only partly the fault of less-than-creative Ajax programmers. Every time you send off a request, no matter how small, there's a lot of network traffic going on. A server has to respond to that request, usually with a new process in its own thread. So if you really move to an evented model, where you might have 10 or 15 individual micro-requests going from a single page to a server, you're going to have 10 or 15 threads (maybe fewer, depending on how threads are pooled and how quickly they're reclaimed on the server) firing up. Now multiply that by 1,000 or 10,000 or 1,000,000 copies of a given page floating around, and you could have chaos. Network slowdowns. System crashes.

The result is that, in most cases, the web needs to be, at a minimum, a medium-event place. The consequence of that concession is that server-side programs aren't sending back tiny responses to very small and focused requests. They're sending back multiple bits of data, and that requires JSON, and then you're back to the eval() problem. The problem is eval(), sure, but the problem is also, from a certain perspective at least, the nature of the web and threading and HTTP traffic between a web page and the server-side program responding to that request.

Some of you who are more advanced JavaScript folks are screaming at this point, because you know better than to use eval(). You're using something like JSON.parse() instead, and there are also some compelling arguments for careful usage of eval(). These are things worth screaming about. Still, just see how many questions there are surrounding eval() on sites like Stack Overflow and you'll realize that most folks don't use eval() correctly or safely. It's a problem, because there are lots of intermediate programmers who just aren't aware of the issues around eval().
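A minimal illustration of the difference, assuming the server has sent back a small JSON payload:

// the raw text of a hypothetical Ajax response
var payload = '{"city": "Austin", "state": "TX"}';

// risky: eval() will execute anything in the string, not just an object literal,
// so a malformed or malicious response can run arbitrary code
var risky = eval('(' + payload + ')');

// safer: JSON.parse() accepts only valid JSON and throws on anything else
var safe = JSON.parse(payload);

console.log(risky.city, safe.state);   // "Austin TX"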

Sending a little data at all times

Node brings a different approach to the party: it seeks to move you and your web applications to an evented model, or if you like, a "small event" model. In other words, instead of sending a few requests with lots of data, you should be sending tons of requests, on lots of events, with tiny bits of data, or requests that need a response with only a tiny bit of data. In some cases, you almost have to recall your GUI programming. (All the Java Swing folks can finally use their pent-up GUI knowledge.) So a user enters their first and last name, and while they're moving to the next box, a request is already asking the server to validate just that name against existing names. The same is true for zip codes, and addresses, and phone numbers. There's a constant stream of requesting and responding happening, tied to almost every conceivable event on a page.
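A sketch of what that constant stream might look like on the page, with each field firing its own tiny request as the user tabs away (the endpoints, field ids, and response shapes here are assumptions):

// fire one small GET request and hand the parsed response to a callback
function microRequest(url, callback) {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', url);
  xhr.onload = function () { callback(JSON.parse(xhr.responseText)); };
  xhr.send();
}

// validate the name as soon as the user leaves the field
document.getElementById('name').addEventListener('blur', function (e) {
  microRequest('/validate-name?name=' + encodeURIComponent(e.target.value), function (result) {
    // hypothetical response: {"ok": true} or {"ok": false}
    e.target.classList.toggle('invalid', !result.ok);
  });
});

// same idea for the zip code, and the address, and the phone number
document.getElementById('zip').addEventListener('blur', function (e) {
  microRequest('/city-state?zip=' + encodeURIComponent(e.target.value), function (result) {
    document.getElementById('city').value = result.city;
  });
});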

So what's the difference? Why is this possible with Node, and don't the same threading issues exist here? Well, no, they don't. Node's own site explains its philosophy best:

Node's goal is to provide an easy way to build scalable network programs. In the "Hello World" web server example ... many client connections can be handled concurrently. Node tells the operating system (through epoll, kqueue, /dev/poll, or select) that it should be notified when a new connection is made, and then it goes to sleep. If someone new connects, then it executes the callback. Each connection is only a small heap allocation.
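For reference, the "Hello World" server that quote mentions is essentially the example from Node's own documentation (give or take the port number):

var http = require('http');

// one process, one event loop: each incoming request simply runs this callback
http.createServer(function (req, res) {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Hello World\n');
}).listen(8124);

console.log('Server running at http://127.0.0.1:8124/');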

Node has no blocking, no threads competing for the same resource (Node is happy to just let things happen however they happen), and nothing that has to start up upon each request. Node just sits around waiting (quite literally; unused Node responders are sleeping). When a request comes in, it's handled. This results in very fast code, without requiring uber-programmers to write the server-side behavior.

Yes, chaos can ensue

It's worth pointing out that this model does allow all the problems of any non-blocking system to come into play: one process (not thread) writing to a data store while another one grabs just-invalidated data; intrusions into what amounts to a transaction; and so on. But realize that the majority of event-based programming on a web form is read-only! How often are you actually modifying data in a micro-request? Very rarely. Instead, there's constant validation, data lookup, and querying going on. In these cases, it's better to just fire away with the requests. The database itself may add some locking, but in general, good databases will do this much more efficiently than server-side code anyway, and they'll certainly handle it better than an operating system can by spinning threads up and down for a generic "a web request came in" process.
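So a typical micro-request handler in Node is a read-only lookup along these lines; lookupCityState() below is a hypothetical stand-in for whatever non-blocking database driver you'd actually use:

var http = require('http');
var url = require('url');

// hypothetical stand-in for an asynchronous, non-blocking database query
function lookupCityState(zip, callback) {
  callback(null, { zip: zip, city: 'Austin', state: 'TX' });
}

http.createServer(function (req, res) {
  var query = url.parse(req.url, true).query;
  lookupCityState(query.zip, function (err, row) {
    res.writeHead(err ? 500 : 200, { 'Content-Type': 'application/json' });
    res.end(JSON.stringify(err ? { error: 'lookup failed' } : row));
  });
}).listen(8125);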

Additionally, Node does have plans to allow for process forking, and the HTML5 Web Workers API is probably the engine that will make this feature go. Still, if you move to an evented model for your web application, you'll probably find you want threading in fewer than one out of 100 situations. And the biggest changes are in how you think about your web applications, and in how often you send and receive data from a server, rather than in how Node works.
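(For what it's worth, later versions of Node did ship a child_process module that covers the forking case. A minimal sketch, where worker.js is a hypothetical script of your own that does the rare heavy lifting:)

// parent.js: hand a heavy job to a separate Node process
var fork = require('child_process').fork;

var worker = fork('./worker.js');   // worker.js is sketched in the comment below
worker.on('message', function (msg) {
  console.log('result from worker:', msg);
});
worker.send({ n: 42 });

// worker.js would contain something like:
//   process.on('message', function (msg) {
//     process.send({ squared: msg.n * msg.n });
//   });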

