JavaScript on Rails is here, and it promises to be as good as Ruby on Rails!

After years of being undeservedly dismissed as that half-baked, inconsistent scripting language used to validate form fields in browsers, JavaScript, or more precisely ECMAScript, appears to be progressing in leaps and bounds. Steve Yegge predicted a few months ago that in its next avatar (ECMAScript 4), it would have what it takes to be the Next Big Language. ECMAScript 4 supports a whole bunch of totally sexy (I can't think of a better adjective) features. To quote from the Wikipedia article:
  • Classes
  • Packages and namespaces
  • Optional static typing
  • Generators and iterators
  • Destructuring assignment (likely)
  • JSON Encoding/Decoding

Not to mention performance improvements as a consequence of the optional static typing.
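
To get a concrete feel for what those bullet points mean, here's a rough, illustrative sketch in the proposed ECMAScript 4 syntax. It's based on my reading of the draft proposal, so treat the exact syntax as an assumption - it may well change before the spec is finalised:

```javascript
// Illustrative ECMAScript 4 draft syntax - not final, and not runnable in today's browsers.
package org.example {
    class Point {
        var x: double;   // optional static type annotations
        var y: double;

        function Point(x: double, y: double) {
            this.x = x;
            this.y = y;
        }

        function distanceTo(other: Point): double {
            return Math.sqrt(Math.pow(this.x - other.x, 2) +
                             Math.pow(this.y - other.y, 2));
        }
    }
}

// Destructuring assignment
var [lat, lng] = [12.97, 77.59];

// Generators: yield turns a function into an iterator factory
function fibonacci() {
    var [a, b] = [0, 1];
    while (true) {
        yield a;
        [a, b] = [b, a + b];
    }
}
```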

Steve also obviously believes in putting his code where his mouth is, because he's gone and ported the whole of Rails - yes, you got that right, ported it line by blessed line - to JavaScript. His implementation uses the Rhino engine, which runs on the JVM. My guess is that this port of Rails to JavaScript will be far more effective than other attempts using mainstream languages like Java. As a language, JavaScript is as open and expressive as Ruby, if not more so. If you want an example of JavaScript's expressiveness, go check out the superb jQuery library if you haven't already done so. It will knock you off your feet, I guarantee you.
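
To give you a taste, here's a small illustrative snippet of the kind of thing jQuery lets you say in a few lines (the selector and class names are made up for the example):

```javascript
// Find every link marked 'external', highlight it, and fade it out when clicked.
// Selector chaining and inline callbacks do in a few statements what raw DOM code
// would take dozens of lines to express.
$(document).ready(function () {
    $('a.external')
        .addClass('highlight')
        .click(function () {
            $(this).fadeOut('slow');
            return false;   // suppress the default navigation
        });
});
```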

This just makes the case stronger for bringing business logic to the browser and getting rid of all those annoying GET/POST-parameter-based web applications. I mean seriously, if an architect suggested building a desktop thick client where the controllers and models lived only on the server and the UI communicated with the controller by passing strings to it to trigger state changes in the model, he'd be considered officially insane. Yet the vast majority of state-interaction-type web applications (those with complex domain models) use exactly such an architecture, and nobody considers it odd.
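
To make the contrast concrete, here's a deliberately simplified sketch (the endpoint, field and class names are all hypothetical) of the same 'transfer funds' interaction done both ways:

```javascript
// Style 1: the typical web app. The browser holds no model at all;
// the UI just posts strings at a server-side controller. (Hypothetical endpoint and fields.)
var form = document.forms['transfer'];
form.action = '/accounts/transfer';
form.elements['from'].value   = '12345';
form.elements['to'].value     = '67890';
form.elements['amount'].value = '500';
form.submit();

// Style 2: a client-side domain model. The browser holds real objects with behaviour
// and only synchronises state changes with the server (which still re-validates them).
function Account(id, balance) {
    this.id = id;
    this.balance = balance;
}
Account.prototype.transferTo = function (other, amount) {
    if (amount > this.balance) {
        throw new Error('Insufficient funds');  // domain logic lives in the model, right here
    }
    this.balance -= amount;
    other.balance += amount;
    // ...then send the state change to the server, e.g. via XMLHttpRequest/JSON
};
```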

Bottom line - once ECMAScript 4 is out and browsers start supporting it, all the 'thick clients are dead, long live the browser' weenies will finally have a case. But only because the browser will have stopped being thin.

You may also want to read: Bringing business logic to the browser, or why you should develop in JavaScript

13 comments:

Mark Bradley said...

Isn't the browser made into a 'thin' client by leaving all the computation on the server?

And I believe the reason nobody has questioned this model is the relative security of the server compared to the browser. For example, if you implemented security checks (for online banking, for instance) in the browser, someone with a hacked version of a browser, or even Firefox with a clever extension, could immediately get access to privileged information.

Unknown said...

Yes, the browser was made 'thin' by leaving the computation on the server. But for a certain class of applications, where the sole advantage of using a browser was easing deployment pains, that thinness was no advantage in itself. A lot of enterprise applications fall into this category.

Gmail, Google Calendar and Google Docs currently use a model similar to what I've described. So it's already out there; it's just not mainstream yet, is all.

I never meant to imply that there will be no adjustments in design or compromises when bringing stuff to the client. Of course, online banking would never have security checks on the client. There would always be a secure third party to do the verification, so that does not change. That's how you'd do it even when developing for the desktop, so the same design principles apply. If someone can still hack this then they could just as easily hack the https form submission used today - since in either case that information has to be sent to the server.

Mark Bradley said...

Oh, I see what you're getting at now. I think this is probably why the Flash clones (AIR, Silverlight and JavaFX) are coming forward now to steal attention before ECMAScript 4 can take off, which could end up being a big problem for its adoption.

Unknown said...

I hadn't thought of that! But you're right and I couldn't agree more.

Lee said...
This comment has been removed by the author.
Lee said...

Instead of re-inventing the wheel with JavaScript on Rails - why not use Grails?

Groovy has all the features you mention for ECMAScript 4.

Unknown said...

Grails doesn't run in the browser. I'm trying to say that we now have a viable browser-based programming language (at least once Gecko and IE become ECMAScript 4 compliant).

And Blogger needs to change their 'This post has been removed by the author.' message to 'This post has been removed by its author.'
Right now it looks like I'm censoring comments on my posts. Which I'm not :-).

Anonymous said...

This is not client side! Yegge is using this on the server. It will not even run in a browser, as it requires Rhino-specific features. It has nothing to do with client-side scripting AT ALL.

Gernot said...

I wouldn't consider an architect insane for creating an architecture that locks the domain model into the server. I would call it good architecture. Passing domain objects between client and server is bad design, and I'm talking from my own experience. I just finished a large app with a smart client, and we chose to expose the domain model to the client. Managing state on the client AND the server gives you a lot of unwanted problems and gains you nothing.

Unknown said...

@anonymous - I consider Yegge choosing JavaScript over, say, Python on the server to be an endorsement of JavaScript as a language - something indicating that it is more than just a 'form validation language'. I never said that his port will run in a browser; I apologise if that was what it sounded like. For the browser you'd use TrimJunction, the GWT or something similar.

@gernot - If you lock your models completely on the server, then it doesn't meet even the marketing definition of a smart client. (I'm assuming this was a .Net project? 'Smart Client' is MS marketing jargon for MS SOAP RPC web services combined with DataSets, so that the client can still be used even when it is no longer connected to the server... more a marketing pitch than a technical design.)

I've been on, or seen my colleagues successfully deliver, multiple projects using both thick clients and browsers where the models were on the client. We've used .Net WinForms, Java Swing, the GWT... never had a problem.
Google has Gmail, Docs and Calendar, which do the same. They don't seem to have a problem either. In my experience, these applications had richer UIs and were faster, more responsive, and easier to modify and refactor.

Bluntly, if you can't manage models on both the client and the server (remember the client-server design that was all the rage before web apps became big?) - something that was being done successfully in Smalltalk almost two decades ago, and in Java Swing and VC++ MFC a decade ago - then that's when you have a bad design.

Kode said...

What an awesome 20% project. Steve Yegge, the man! 8)

Keep Clicking,
Bosky, Javascript Hacker

Unknown said...

Turns out it wasn't a 20% project and John Lam got it wrong when he assumed it was all done by Steve. See Steve's post on the topic.
This is just one part of a mainstream Google project.
Looks like it isn't going open source anytime soon either.

Venkat said...

Sounds interesting... looking forward to seeing how this competes with Flex et al.