Angularjs is evil: dependency injection

A year ago I was excited about Angular, and I still think Angular is a game changer. I see how the community is charmed by Angular and how easy it is to start building web apps with it. However, after a year of using Angular I must say that I do not recommend investing time into this framework. I will write several posts about why.

One of the things I hate most about Angular is its module system. Sadly, it is deeply injected into its DNA. Someone decided that DI in JavaScript should look like this:

// style one: the $inject annotation
function someFactory(stuff){
	//use stuff
}
someFactory.$inject = ['stuff'];

// style two: the inline array annotation
app.factory('someFactory', ['stuff', function(stuff){
	//use stuff
}]);

It looks cool. But for me as a JS developer it's too messy and complex (someFactory -> someFactoryProvider o_O WTF?!). Here is dependency injection in pure JS:

function someFactory(stuff){
	return function(){ //we're using a closure for DI here
		//use stuff
	}
}
someFactory(stuff) // injecting stuff

I am using a simple closure for dependency injection. It feels more natural. But that still can't replace Angular's module system, right? For that we can use CommonJS (via browserify or brunch).

var stuff = require('./stuff');
module.exports = function someFactory(){ //and again we use closure
	//use stuff
}

Why is that better? It is simple. It is a better way of structuring your app: every dependency is an explicit reference, so you can say directly where the dependency lives. It already works well for nodejs. It is compatible with existing package managers.

I know that how JS modules should look is a debatable topic. I think the CommonJS pattern is quite usable and a much better fit for JS than AMD or ES6 modules. Either way, I definitely think that Angular got it wrong.

Mixing Reactjs with Backbone

Reactjs is a JavaScript library for building user interfaces, open-sourced by Facebook just recently.

Not long ago I felt that as a developer I had more or less two good options for building an app: do it in #angular or in #backbone. Now I feel that #react takes the best parts of Angular, does them better, and still lets you use the best parts of Backbone.

I hate Backbone Views and I hate Angular's $scope, especially when it comes to directive scope and all the &-@-= stuff. Transclusion plus scope is a double hell, and I am not even talking about digest cycles and performance yet.

React has a really small API and it does one thing, but does it really well. It abstracts the DOM for you and optimizes the rendering part. Each time you need React to reflect state changes in the DOM, it renders a lightweight DOM representation in JavaScript and applies only the diff to the real DOM. That way rendering becomes really cheap, unlike in Angular, and it allows us to build apps with different patterns in mind.
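
To make that concrete, here is a toy sketch of diff-based rendering. It only illustrates the concept and is nothing like React's actual algorithm; the patch function and the element id are made up:

// describe the UI as cheap JS objects...
function patch(el, prev, next){
  // ...and write to the real DOM only where the description changed
  if(prev.text !== next.text) el.textContent = next.text;
  if(prev.className !== next.className) el.className = next.className;
}

var prevView = { text: 'Hello', className: 'greeting' };
var nextView = { text: 'Hello world', className: 'greeting' };
patch(document.getElementById('out'), prevView, nextView); // one cheap DOM write

React does this for whole component trees, which is what makes "just re-render on every change" affordable.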

And here are some tips I got from several weeks of playing around with #Reactjs.

React is just V

React needs other stuff, like routing and models. I am taking them from Backbone.

Models are state

By default React has a single state object, this.state, which is usually not the best solution. It appears that a cleaner way is to have multiple states, where this.state holds the non-persistent state and Backbone models hold the persistent one.

In React's examples you can find a BackboneMixin, but it has some flaws. The following one is better, since it does proper cleanup.

var ModelMixin = {
  componentDidMount: function() {
    // Whenever there may be a change in the Backbone data, trigger a reconcile.
    this.getBackboneModels().forEach(this.injectModel, this);
  },
  componentWillUnmount: function() {
    // Ensure that we clean up any dangling references when the component is
    // destroyed.
    (this.__syncedModels || []).forEach(function(model) {
      model.off(null, model.__updater, this);
    }, this);
  },
  injectModel: function(model){
    if(!this.__syncedModels) this.__syncedModels = [];
    if(!~this.__syncedModels.indexOf(model)){
      // re-render the component whenever the model changes
      var updater = this.forceUpdate.bind(this, null);
      model.__updater = updater;
      model.on('add change remove', updater, this);
      this.__syncedModels.push(model);
    }
  }
};

That way you can use the same models in several nested components.

  <RootComponent user={new UserModel({id: id})}>
    <ContactComponent user={this.props.user}/>
    <UserpicComponent user={this.props.user}/>
  </RootComponent>

2 way binding

It's kinda logical to have two-way binding with those Backbone models. The LinkedState plugin works only with component state, so here is a BindMixin which does basically the same as LinkedState but for Backbone models.

var BindMixin = {
  bindTo: function(model, key){
    // returns a valueLink-compatible {value, requestChange} pair
    return {
      value: model.get(key),
      requestChange: function(value){
        model.set(key, value);
      }
    };
  }
};

This mixin adds a bindTo method that binds a control to a model property, as simply as this.bindTo(user, 'name'):

var Hello = React.createClass({
  mixins:[ModelMixin, BindMixin],
  getBackboneModels: function(){
    return [this.props.instance]
  },
  render: function() {
    var model = this.props.instance;
    return <div>
        <div>Hello {model.get('initial')}</div>
        <input type="text" valueLink={this.bindTo(model, 'initial')}/>
      </div>
  }
});

Here is a working example: http://jsfiddle.net/djkojb/qZf48/24/

using private components in compy

There are core limitations in component that make it hard to use private git repositories directly, unless you use GitHub. The component FAQ proposes using the remotes property with any web server that serves the same URLs as GitHub.

package.json

{
  ...
  "compy":{
    ...
    "remotes":["https://user:pass@raw.github.com"]
  }
}

But there is a better way to manage private components with any git server you like.

using git submodules to manage private components

Component supports local dependencies. That means it can serve components from any local folder you list via the paths and local params in the config.

package.json

{
  ...
  "compy":{
    ...
    "paths":["local"],
    "local":["component1","component2"]
  }
}

So if you want to use private components as local dependencies, put git submodules with those components in that folder.

Compy will serve them like regular components, and you will manage them with the git CLI.

adding a component to the folder

You can add a component to the local folder like this:

cd local
git submodule add git://github.com/chneukirchen/rack.git

socker: websocket CRUD over engine.io

One substantial difference between using sockets and plain HTTP requests is that with sockets we usually broadcast messages without expecting any response. While building jschat I thought that would be enough. But even for a chat we need responses if we want reliability and a better experience: think of the "Pending" state of a message sent in offline mode.

Raw socket libraries don't provide any 'response'-like functionality, so I had to build my own implementation. As a base for jschat we are using engine.io, because socket.io has not been maintained for a long time and engine.io is kind of its successor, and it's awesome.

socker

Socker is inspired by express: a simple and lightweight implementation of middleware, routing and error handling. Socker wraps both engine.io and engine.io-client and provides additional methods that implement an express-like API.

setting up socker

We can use engine.io and socker with or without express:

//backend
var http = require('http');
var nconf = require('nconf'); // config, used for the port below
var engine = require('engine.io');
var socker = require('socker');

var app = require('express')();
var server = http.createServer(app);
server.listen(nconf.get('server:port'));
server = engine.attach(server);

socker(server); // wrapping server with additional methods
server.on('connection', function(socket){
  socker.attach(socket); // we are attaching socker to the socket
});
// frontend
var socket = require('engine.io')('ws://localhost');
var sockerClient = require('socker-client'); // we can use it as standalone though
sockerClient(socket);

sending the message from client

On the client we have an additional serve method on the socket:

//socket.serve(<optional> route, <optional> message, <required> callback);

socket.serve({message:"Hello world!"}, function(err, data){
  // err contains an error object if one was thrown
  // data is the response data
});
socket.serve('READ /api/item/343', function(err, data){
  // err contains an error object if one was thrown
  // data is the response data
});
socket.serve('CREATE /api/items', {itemName : "foo"}, function(err, data){
  // err contains an error object if one was thrown
  // data is the response data
});

handling the message on server

On the server we additionally have the sock.use and sock.when methods. sock.use adds a middleware handler. Middleware in our case receives socket and data objects instead of request and response.

server.sock.use(logger);
function logger(socket, data, next){
  // socket is the socket object
  // data is the data object sent with the request
  console.log(data);

  // the socket object has a .json method to send a response
  if(weNeedTo) return socket.json({responseMessage: "bar"});

  // or we can pass an error down the chain
  if(weNeedToThrowError) return next("Error message");

  // otherwise pass control to the next handler
  next();
}

socket is a per-message context object, and you can attach to it whatever you like. The "session" context object is socket.__proto__, so if you want to keep some data for the lifetime of the connection, use the prototype object.
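
For example, here is a minimal sketch of connection-scoped data, assuming the middleware signature shown above; authenticate is a hypothetical helper, not part of socker:

server.sock.use(function(socket, data, next){
  if(data && data.token){
    // socket lives only for this message, but its prototype is shared by
    // every message on this connection, so user survives between messages
    Object.getPrototypeOf(socket).user = authenticate(data.token); // hypothetical helper
  }
  next();
});

server.sock.when('READ /api/me', function(socket, data, next){
  socket.json({name: socket.user && socket.user.name}); // read back via the prototype chain
});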

handling routing

Inside routing middleware the route is already parsed, and we also have a socket.params object with all the params from the route.

server.sock.when('CREATE /api/items', checkItem, createItem);
server.sock.when('READ /api/item/:id', getItem);
function getItem(socket, data, next){
  // socket.params['id'] contains the id from the route
  // data is the data that was sent
  socket.json({room:"name", id: 343});
}

The METHOD uri mask is not required by socker; you can name your routes however you like:

server.sock.when('Server, please, give me room with :id', callback);
//or
server.sock.when('Bloody server! I command you to stay on your knees and give all items you got.', callback);

error handling

We can also customize error handling.

server.sock.use(function(err, socket, data, next){
  if(err){
    socket.json({type:"ERROR", err:err, code: 500})
  }
})

It is important to set type: "ERROR", because that is how the client knows to treat the message as an error.
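
Roughly, the client side can separate the two cases like this; the dispatch function below is illustrative only, not socker-client's actual source:

function dispatch(message, callback){
  // an ERROR-typed message goes to the err argument of the serve callback
  if(message.type === 'ERROR') return callback(message.err);
  // everything else is a normal response
  callback(null, message);
}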

try it

You get a clean and simple API and some latency boost: you save the roundtrip to your session storage and the handshake time. And with socker, moving from an express REST API to a socket-based API is really simple.
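
For instance, here is the same endpoint expressed both ways (getItem is a hypothetical helper):

// express:
app.get('/api/item/:id', function(req, res){
  res.json(getItem(req.params.id));
});

// socker:
server.sock.when('READ /api/item/:id', function(socket, data, next){
  socket.json(getItem(socket.params.id));
});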

why build another app compiler?

When you are a frontend developer and start doing Node, npm completely spoils you, because unlike what we are used to, it provides a single, predictable way of adding and using 3rd-party libs and snippets in your code.

Frontend is more complex in many ways. It is more fragmented, since there are HTML and CSS in addition to JavaScript, and our code runs in different combinations of VMs and platforms.

The commonly used way of adding 3rd-party libs is a /vendor folder that holds a bunch of unminified (if you're lucky) files that were downloaded by someone ages ago. Maybe you will find comments inside that give you an idea of which version of the library is used, maybe not. Also, what-loads-first dependency management is completely your pain. You might have a master file with all the scripts loaded in the 'right' order.

Bower does a great job adding more metadata to packages, fixing some of these problems. But Bower is just a package manager (c) and it doesn't load scripts. So again, you need to do the additional job of defining what-loads-when relations.

Even if you use require.js, you still need to configure 3rd-party libraries. Besides, requirejs adds its own complexities to your code. For example: do you know the difference between the require and define functions? And frankly, why should you need to know the difference?! You need something that just works.

So at the end of the day we need a package manager that delivers libs into our app, require functionality that handles script dependencies, and a builder that wires the whole thing together and gives back 3 files: index.html, app.js and app.css.

and compy can do it

Componentjs does most of the work already. Compy just wraps the concept in one solid solution.

component package manager

Componentjs was the obvious choice. Unlike npm or bower, component is really strict about which files are considered source, which is not that important on the server but really important on the frontend.
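
For example, a component.json has to enumerate its source files explicitly (the values below are illustrative):

{
  "name": "dialog",
  "version": "0.1.0",
  "scripts": ["index.js", "template.js"],
  "styles": ["dialog.css"],
  "dependencies": {
    "component/emitter": "*"
  }
}

Nothing outside those lists ends up in your build.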

local require

Componentjs gives you local require out of the box. Component's require is synchronous: your files are wrapped in a scope and concatenated into one file, and your dependencies are already loaded when you require them. Thus you don't break JavaScript, and require becomes plain, simple and clear.
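
A rough sketch of the concept (not component's actual output): every file is registered in the bundle as a function, and require simply looks it up and runs it.

var modules = {};
function register(path, fn){ modules[path] = fn; }
function require(path){
  var module = { exports: {} };
  // the module is already in the bundle, so this call is synchronous
  modules[path](module, module.exports, require);
  return module.exports;
}

register('./add', function(module, exports, require){
  module.exports = function(a, b){ return a + b; };
});

require('./add')(2, 2); // 4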

builder

The builder takes responsibility for compiling out three files: app.js, app.css and index.html. app.js is built from js dependencies (components), precompiled templates and js source files. app.css is just the concatenated css files, and index.html is generated automatically to load the js/css and run the "main" file. The builder has a bunch of plugins that precompile sources, so you can use coffeescript, scss, jade, whatever. And, technically, because we avoid one read/write cycle compared to plain grunt, it's faster than grunt. And you can mix technologies: require coffeescript from js and vice versa.

compy: simple way of building webapps

Compy is a simple, 'zero' configuration web app builder/compiler integrated with the client-side package manager component. Although there is almost no configuration, it gives you all the flexibility to code the way you like.

Start

Install compy with npm:

$ npm install compy -g

To start an app, all you need is to tell compy where the beginning is. To do that you need a package.json file with a compy.main property pointing to the main js file of your app.

{
  "name" : "app",
  "compy" : {
    "main" : "appstart.js"
  }
}

The appstart.js file will be executed right after the page loads.

To compile the app, just run $ compy compile.

Compy will generate a ./dist folder with app.js, app.css and index.html. All css in your directory will be concatenated and minified into the app.css file.

Compy has a static server, so you can check the result with:

$ compy server [watch]

Adding the watch option will recompile the app and livereload the changes in the browser.

Components

The most powerful part of compy is local require and the integration with component.

To install jquery:

$ compy install jquerycomp/jquery

to use jquery in code:

var $ = require('jquery');
$(document.body).html("Hello world");

Local require works the same as in node.js:

//filename: add.js
module.exports = function(a, b){
  return a + b;
}
//filename: appstart.js
var add = require('./add');
add(2,2); //4

Plugins

compy supports component's plugins.

That means you can use them to work with the language or template engine you want. For example, to use coffeescript you will need to install the plugin in your root folder:

$ npm install component-coffee

Now, after recompilation, all your coffee files will be treated as javascript. That also means you can use both js and coffee files in the same repo.

#filename: add.coffee
module.exports = (a, b) ->
  a + b
//filename: appstart.js
var add = require('./add');
add(2,2); //4

And there is more

Compy is built on top of grunt. Basically it is just a grunt setup, so no magic here. Though lots of stuff is available:

  • components support
  • local require
  • supporting coffeescript, sass, jade and other plugins
  • static server + livereload
  • karma runner
  • grunt extendable

May the force be with you!