Archives

framework vs microlib architecture

Published:

Just recently I think I finally understood why holy wars between programmers happen. During the revolution I saw essentially the same holy war between people who believe their truth is the only truth. Why is it hard, even for smart people on both sides, to agree on a shared vision? I believe the reason is that they have different values.

We often underestimate how important common values are for feeling comfortable and productive in a team. We rarely care about shared values when looking for a new job; we care about salary more. Sure, we base the decision to accept an offer on many factors, shared values among them, but we do that unconsciously.

The JavaScript community is quite inhomogeneous, so you can see all kinds of values there. So far I can distinguish two types of people: those who love classical OOP languages with a strict, stable structure and established patterns, and those who love alternative languages that have no predetermined patterns and offer ‘unexpected’ flexibility. Talking with the first group, it seems they are afraid of chaos and intolerant of any unpredictability. The second group is so bored with structure and ready-made solutions that they run away from enterprises like from hell. I know very few people who are OK with both.

So here I come to the framework vs microlib architecture discourse. Inside the JavaScript community itself we have a kind of holy war around this topic. Now I feel it doesn’t make sense to participate in this war, since the final decision is always based on our values, and that means the “common sense” we appeal to is quite individual.

PS: The main question in any job interview should be about values, always!


Angularjs is evil: overengineering hell

Published:

This, I hope, is the last post about how Angular will bring you to a world of pain.

Recently I stumbled on this AngularJS hate article that I am totally aligned with on an emotional level. Angular is clearly overengineered. And despite what some may say, it does not give you exceptional scalability.

I would compare Angular to React. I know they are not directly comparable. But I believe a React-based architecture beats the Angular solution in terms of simplicity and scalability.

React vs AngularJS by number of concepts to learn

  • React stack: 4 (everything is a component; some components have state, though you may use a model instead; CommonJS modules; router).
  • AngularJS: 8 (modules, router, controllers, directives, scopes, templates, services, filters).

That is twice as many concepts to learn for Angular as for React, not to mention that React’s concepts are much simpler. For example, in Angular you have controllers and directives with templates that do more or less the same thing. With React the component is the only building block of your application. We all know that simplicity scales better, right?

directive vs component

//Angular
App.directive('myDirective', function() {
  return {
    restrict: 'E',
    transclude: true,
    scope: {
      link: '@link'
    },
    template: '<a href="#/{{link}}" ng-transclude></a>',
    link: function (scope, element, attrs) {
      //do stuff with scope
    }
  };
});
//usage
<my-directive link="somewhere"><span>GO</span></my-directive>
//React
var myComponent = React.createClass({
  componentDidMount: function(){
    //do stuff with this.props
  },
  render: function() {
    return <a href={'#/' + this.props.link}>{this.props.children}</a>;
  }
});
//usage
<myComponent link='somewhere'><span>GO</span></myComponent>
//JSX transformed (roughly)
myComponent({link: 'somewhere'}, React.DOM.span(null, 'GO'));

The substantial difference between React and Angular here is that React is JavaScript friendly: you just pass stuff as props, and since components are functions, props are passed as function arguments.

Component is a function!

service vs function

//Angular
myApp.service('unicornLauncher', ["apiToken", UnicornLauncher]);

function UnicornLauncher(apiToken) {
  this.launchedCount = 0;
  this.launch = function() {
    // make a request to the remote api and include the apiToken
    ...
    this.launchedCount++;
  }
}

//Javascript
var apiToken = require('../apiToken.js');

module.exports = function UnicornLauncher() {
  this.launchedCount = 0;
  this.launch = function() {
    // make a request to the remote api and include the apiToken
    ...
    this.launchedCount++;
  }
}

Service/provider in Angular is a solution for a made-up problem. Just use CommonJS and you won’t need the service/provider thing. You will just use modules and functions that are natural for JS.

Service is a function!

filter vs function

//Angular
App.filter('incr', function() {
  return function(input) {
    return input + 1;
  };
})
<div>{{value | incr}}</div>

//React
function incr(input){
  return input + 1;
}

<div>{incr(value)}</div>

Well, a filter is pretty useful if you use HTML templates as strings. Life is easier with React when you do not use strings for templates.

Filter is a function!

template vs JSX

//Angular
<div>
  <ProfilePic username='username' />
  <ProfileLink username='username' />
</div>

//Reactjs
/** @jsx React.DOM */
var Avatar = React.createClass({
  render: function() {
    return (
      <div>
        <ProfilePic username={this.props.username} />
        <ProfileLink username={this.props.username} />
      </div>
    );
  }
});
//transformed (roughly)
var Avatar = React.createClass({
  render: function() {
    return (
      React.DOM.div(null,
        ProfilePic({username: this.props.username}),
        ProfileLink({username: this.props.username})
      )
    );
  }
});

Functions are better than strings. Functions can work with closures. Functions are faster. And in JavaScript functions are first-class citizens. Functions are much more logical than strings.

Template is a function!

With React you live in a world of functions. With Angular you live in a world of enterprise patterns.

My next story will be about why Angular might work for you.


managers are taking your project's breath

Published:

Theoretically there is no manager in agile. If someone tells you they have an agile team with a manager, they are lying. It mostly means that, for some reason, they are not ready to share responsibility across the team: they do not have agile.

A few years ago I was lucky to participate in an agile workshop by @jeffpatton. It had a huge impact on my understanding of application development. By now I have embraced some basic agile principles and consider them healthier for the internal group dynamics of a team. Agile literally makes each individual in the team healthier and happier.

People are lazy. Developers are no exception. We do not like to work and take responsibility. We easily hand our responsibilities to anyone who will take them. And the worst thing you can do is to give all responsibility for a project’s success to one person. That is what happens when you put a manager in your team.

On the other hand, people tend to step up and take responsibility for the products they are building, and when they do, they become more engaged and proud of the stuff they do. They stop asking stupid questions and start committing themselves to the product.

We are developers: we come to our jobs and spend eight hours a day doing magic. We really want to build something that makes sense, that could make the world better. Isn’t that the best motivation for us? Business, please spend time sharing your passion about the product with us!

PS: There is only one case when having a project manager is a good idea: a short-term project where the manager also takes the BA role. In that case it just does not make sense to invest in building a team.


Angularjs is evil: the scope horror

Published:

This is the second post about how Angular makes my life painful.

The scope is one of the most complex concepts in Angular. You never know which scope you are in and what is available inside it. Things may change if you add arguments somewhere at the top node.

var app = angular.module('myApp', []);

function foo($scope) {
  $scope.value = "foo";
}

function bar($scope) {
  $scope.value = "bar";
}
<div ng-controller="foo">
  <div ng-controller="bar">
    {{value}}
  </div>
</div>

You can miss a new ng-controller, or someone can overwrite your context. It looks OK until you have really complex HTML.

Things become more infernal when you try to create a directive. A directive may or may not have its own scope. Obviously you pass stuff into the scope through the attributes of the newly created “element”. You can do it in three different ways with the craziest API I have ever seen: you get three magic symbols, & = @, and they do different things with the stuff you throw into your attributes. I can only say it looks reasonable until you find an alternative that does the same thing and makes sense at the same time.
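
For reference, here is a rough sketch of what those three symbols do in an isolated directive scope (the directive and attribute names are made up for illustration):

//Angular
App.directive('myWidget', function() {
  return {
    restrict: 'E',
    scope: {
      title: '@',   // '@' copies the attribute value in as a string
      item: '=',    // '=' sets up two-way binding to a parent scope property
      onSave: '&'   // '&' wraps a parent scope expression in a callable function
    },
    template: '<button ng-click="onSave()">{{title}}: {{item.name}}</button>'
  };
});
//usage
<my-widget title="Save" item="currentItem" on-save="save(currentItem)"></my-widget>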

And then you try to do what every Angular developer does at least once in his life: you take transclusion and try to put the directive scope inside the transcluded HTML. It looks like a great idea at the beginning.

The other thing I hate about $scope is that I never know when the template will be rendered. It is quite painful to see the magic without knowing what’s happening. I know that I can learn it; my point, though, is that if I need to learn so much to embrace it, it is probably done the wrong way.


Angularjs is evil: dependency injection

Published:

A year ago I was excited about Angular, and I still think Angular is a game changer. I see how the community is charmed with Angular and how easy it is to start building web apps with it. However, after a year of using Angular I must say that I do not recommend investing time into this framework. I will write several posts about why.

One of the things I hate most about Angular is its module system. Sadly, it is deeply embedded in its DNA. Someone decided that DI in JavaScript should look like this:

function someFactory(stuff){
  //use stuff
}
someFactory.$inject = ['stuff'];

app.factory('someFactory', ['stuff', function(stuff){
  //use stuff
}])

It looks cool. But for me as a JS developer it’s too messy and complex (someFactory -> someFactoryProvider, o_O WTF?!). Here is dependency injection in pure JS:

function someFactory(stuff){
  return function(){ //we're using a closure for DI here
    //use stuff
  }
}
someFactory(stuff) // injecting stuff

I am using a simple closure for dependency injection. It feels more natural.
But this still doesn’t replace Angular’s module system, right? For that we need CommonJS (you can use browserify or brunch).

var stuff = require('./stuff');
module.exports = function someFactory(){ //and again we use a closure
  //use stuff
}

Why is that better? It is simple. It is a better way of structuring your app: it gives you references, and you can directly see where a dependency lives. It already works well for Node.js. And it is compatible with existing package managers.

I know that how JS modules should look is a debatable topic. I think the CommonJS pattern is quite usable and a much better fit for JS than AMD or ES6 modules. But I definitely think that Angular got it wrong.


Reactjs mixing with Backbone

Published:

Reactjs is a JavaScript library for building user interfaces, open-sourced by Facebook just recently.

Not long ago I felt that, as a developer, I had more or less two good options for building an app: do it in #angular or in #backbone. Now I feel that #react takes the best of Angular, does it better, and lets you keep the best parts of Backbone.

I hate Backbone views and I hate Angular’s $scope, especially when it comes to directive scope and all that &-@-= stuff. Transclusion plus scope is a double hell, and I am not even talking about the digest cycle and performance yet.

React has a really small API and it does one thing, but does it really well. It abstracts the DOM for you and optimizes the rendering part. Each time React needs to reflect state changes in the DOM, it renders a lightweight DOM in JavaScript and applies only the diff to the real DOM. That way rendering becomes really cheap, unlike in Angular. And that allows us to build apps with different patterns in mind.

Here are some tips I got from several weeks of playing around with #Reactjs.

React is just V

React needs other stuff like routing and models. I am taking those from Backbone.

Models are state

By default React has a single state, this.state, which is usually not the best solution. It appears that a cleaner way is to have multiple states, where this.state holds the non-persistent state and Backbone models hold the persistent one.

In React’s examples you can find a BackboneMixin, but it has some flaws. The following one is better since it does proper cleanup.

var ModelMixin = {
  componentDidMount: function() {
    // Whenever there may be a change in the Backbone data, trigger a reconcile.
    this.getBackboneModels().forEach(this.injectModel, this);
  },
  componentWillUnmount: function() {
    // Ensure that we clean up any dangling references when the component is
    // destroyed.
    this.__syncedModels.forEach(function(model) {
      model.off(null, model.__updater, this);
    }, this);
  },
  injectModel: function(model){
    if(!this.__syncedModels) this.__syncedModels = [];
    if(!~this.__syncedModels.indexOf(model)){
      var updater = this.forceUpdate.bind(this, null);
      model.__updater = updater;
      model.on('add change remove', updater, this);
      this.__syncedModels.push(model);
    }
  }
}

That way you can use the same models in several nested components.

<rootComponent user={new UserModel({id: id})}>
  <contactComponent user={this.props.user}/>
  <userpicComponent user={this.props.user}/>
</rootComponent>

two-way binding

It’s kind of logical to have two-way binding with those Backbone models. The LinkedState plugin works only with state, so here is a BindMixin which does basically the same as LinkedState but for Backbone models.

var BindMixin = {
  bindTo: function(model, key){
    return {
      value: model.get(key),
      requestChange: function(value){
        model.set(key, value);
      }.bind(this)
    }
  }
}

This mixin adds a bindTo method that binds a control to a model property as simply as this.bindTo(user, 'name'):

var Hello = React.createClass({
  mixins: [ModelMixin, BindMixin],
  getBackboneModels: function(){
    return [this.props.instance]
  },
  render: function() {
    var model = this.props.instance;
    return <div>
      <div>Hello {model.get('initial')}</div>
      <input type="text" valueLink={this.bindTo(model, 'initial')}/>
    </div>
  }
});

Here is a working example: http://jsfiddle.net/djkojb/qZf48/24/


using private components in compy

Published:

There are core limitations in component that make it hard to use private git repositories directly, unless you use GitHub. The component FAQ proposes using the remotes property together with any web server that serves the same URLs as GitHub.

package.json

{
  ...
  "compy": {
    ...
    "remotes": ["https://user:pass@raw.github.com"]
  }
}

But there is a better way to manage private components with any git server you like.

using git submodules to manage private components

Component supports local dependencies. That means it can serve components from any local folder you point it at in the config.

package.json

{
  ...
  "compy": {
    ...
    "paths": ["local"],
    "local": ["component1", "component2"]
  }
}

So if you want to use local dependencies, you put git submodules with those private components into that folder.

Compy will serve them as usual components, and you will manage them with the git CLI.
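
For example, checking out a project that uses such submodules might look like this (the repository and folder names here are hypothetical):

# clone the app together with its private components
git clone git@yourserver:yourapp.git
cd yourapp
git submodule update --init

# later, pull the latest revision of a single component
cd local/component1
git pull origin master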

adding component to folder

You can add a component to the local folder like this:

cd local
git submodule add git://github.com/chneukirchen/rack.git


socker: websocket CRUD over engine.io

Published:

One substantial difference of using sockets instead of plain HTTP requests is that we usually broadcast messages without expecting any response. While building jschat I thought that would be enough. But even for a chat we need a response if we want reliability and a better experience. Think of the “Pending” state of a message when sending it in offline mode.

Raw libraries don’t provide any ‘response’-like functionality, so I had to build my own implementation. As a base for jschat we are using engine.io, because socket.io has not been supported for a long time and engine.io is kind of its successor, and it’s awesome.

socker

Socker is inspired by express: a simple and lightweight implementation of middlewares, routing and error handling. Socker wraps both engine.io and engine.io-client and provides additional methods that implement an express-like API.

setting up socker

We can use engine.io and socker with or without express:

//backend
var http = require('http');
var nconf = require('nconf');
var engine = require('engine.io');
var socker = require('socker');

var app = require('express')();
var server = http.createServer(app);
server.listen(nconf.get('server:port'));
server = engine.attach(server);

socker(server); // wrapping the server with additional methods
server.on('connection', function(socket){
  socker.attach(socket); // we are attaching socker to the socket
});

// frontend
var socket = require('engine.io-client')('ws://localhost');
var sockerClient = require('socker-client'); // we can use it standalone though
sockerClient(socket);

sending a message from the client

On the client we get an additional serve method on the socket:

//socket.serve(<optional> route, <optional> message, <required> callback);

socket.serve({message: "Hello world!"}, function(err, data){
  // err contains the error object if one was thrown
  // data is the response data
})
socket.serve('READ /api/item/343', function(err, data){
  // err contains the error object if one was thrown
  // data is the response data
})
socket.serve('CREATE /api/items', {itemName: "foo"}, function(err, data){
  // err contains the error object if one was thrown
  // data is the response data
})

handling the message on the server

On the server we additionally get sock.use and sock.when methods. sock.use adds a middleware handler. A middleware in our case, instead of request and response, gets socket and data objects.

server.sock.use(logger);
function logger(socket, data, next){
  // socket is the socket object
  // data is the data object sent with the `request`
  console.log(data);

  // the socket object has a .json method to send a response
  if(weNeedTo) return socket.json({responseMessage: "bar"});

  // or we can throw an error
  if(weNeedToThrowError) return next("Error message");

  // or we can pass control to the next handler
  next()
}

The socket is a per-message context object, and you can attach whatever you like to it. The “session” context object is socket.__proto__, so if you want to save some data for the lifetime of the connection, use the prototype object.
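
A rough sketch of that idea, using the middleware signature from above (the property names are made up):

server.sock.use(function(socket, data, next){
  // per-message data: the context object lives only while this message is handled
  socket.startedAt = Date.now();

  // per-connection data: the prototype is shared by every message context
  // created for this socket, so it survives between messages
  socket.__proto__.userId = data.userId;

  next();
});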

handling routing

Inside routing middlewares the route is already parsed, and we also get a socket.params object with all the params from the route.

server.sock.when('CREATE /api/items', checkItem, createItem);
server.sock.when('READ /api/item/:id', getItem);
function getItem(socket, data, next){
  // socket.params['id'] contains the id from the route
  // data is the data that was sent
  socket.json({room: "name", id: 343});
}

Using the METHOD uri mask is not required by socker; you can name your routes however you like.

server.sock.when('Server, please, give me room with :id', callback);
//or
server.sock.when('Bloody server! I command you to stay on your knees and give all items you got.', callback);

error handling

We can also customize error handling.

server.sock.use(function(err, socket, data, next){
  if(err){
    socket.json({type: "ERROR", err: err, code: 500})
  }
})

It is important to put type: "ERROR" in the response, because that is how the client will know to treat the message as an error.
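
On the client that error should come back through the serve callback shown earlier; a minimal sketch, assuming socker-client maps such responses to the err argument:

socket.serve('READ /api/item/343', function(err, data){
  if(err){
    // err is built from the {type: "ERROR", ...} response sent by the server
    console.error('request failed:', err);
    return;
  }
  // data is the successful response
});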

try it

You get a clean and simple API and some latency boost: you save the roundtrip to your session storage and the handshake time. And now, with socker, moving from an express REST API to a socket-based API is really simple.


why building another app compiler?

Published:

When you are a frontend developer and start doing Node, npm completely spoils you, because unlike what we are used to, it provides a single, predictable way of adding and using 3rd-party libs and snippets in your code.

Frontend is more complex in many ways. It is more fragmented, since there are HTML and CSS in addition to JavaScript, and our code runs in different combinations of VMs and platforms.

The commonly used way of adding 3rd-party libs is a /vendor folder that holds a bunch of unminified (if you’re lucky) files that were downloaded by someone ages ago. Maybe you will find comments inside that give you an idea of which version of the library is used, maybe not. Also, the what-loads-first dependency management is completely your pain. You might have a master file with all the scripts loaded in the ‘right’ order.

Bower does a great job adding more metadata to packages and fixing some problems. But Bower is just a package manager (c), and it doesn’t load scripts. So again, you need to do the additional job of defining the what-loads-when relations.

Even if you use require.js, you still need to configure 3rd-party libraries. Besides, requirejs adds its own complexities to the code. For example: do you know the difference between the require and define functions? And frankly, why should you need to know the difference? You need something that just works.

So at the end of the day we need a package manager that delivers libs into our app, a require functionality that handles script dependencies, and a builder that wires the whole thing together and gives back 3 files: index.html, app.js and app.css.

and compy can do it

Componentjs does most of the work already. Compy just wraps the concept in one solid solution.

component package manager

Componentjs was the obvious choice. Unlike npm or bower, component is really strict about which files are considered a source, which is not that important for the server but really important for the frontend.

local require

Componentjs gives you local require out of the box. Component’s require is synchronous: your files are wrapped in a scope and concatenated into one file, and your dependencies are already loaded when you require them. Thus you don’t break JavaScript, and require becomes plain, simple and clear.
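
This is not component’s actual output, just a conceptual sketch of how synchronous require can work once everything is concatenated: every module is registered up front, so require is just a lookup.

var modules = {};

function register(name, factory){
  modules[name] = {factory: factory};
}

function require(name){
  var mod = modules[name];
  if(!mod.module){ // instantiate lazily, only the first time it is required
    mod.module = {exports: {}};
    mod.factory(mod.module);
  }
  return mod.module.exports;
}

register('add', function(module){
  module.exports = function(a, b){ return a + b; };
});

require('add')(2, 2); // 4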

builder

The builder takes responsibility for compiling three files: app.js, app.css and index.html. app.js is built of js dependencies (components), precompiled templates and js source files; app.css is just the concatenated css files; and index.html is generated automatically to pull in the js/css and run the “main” file. The builder has a bunch of plugins that precompile sources, so you can use coffeescript, scss, jade, whatever. And just technically, because we avoid one read/write cycle compared to plain grunt, it’s faster than grunt. And you can mix technologies: require coffeescript from js and vice versa.


compy - simple way of building webapps

Published:

Compy is a simple, ‘zero’ configuration web app builder/compiler integrated with the client package manager component.
Although there is almost no configuration, it gives you all the flexibility to code the way you like.

Start

Install compy with npm:

$ npm install compy -g

To start an app, all you need is to tell compy where the beginning is. To do that you need a package.json file with the compy.main property pointing to the main js file of your app.

{
  "name": "app",
  "compy": {
    "main": "appstart.js"
  }
}

The appstart.js file will be executed right after the page loads.

To compile the app, just run $ compy compile

Compy will generate a ./dist folder with app.js, app.css and index.html. All the css in your directory will be concatenated and minified into the app.css file.

Compy has a static server, so you can check the result with:

$ compy server [watch]

Adding the watch option will recompile the app on changes and livereload them in the browser.

Components

The most powerful part of compy is local require and the integration with component.

To install jquery:

$ compy install jquerycomp/jquery

to use jquery in code:

var $ = require('jquery');
$(document.body).html("Hello world");

Local require works the same way as in node.js:

//filename: add.js
module.exports = function(a, b){
  return a + b;
}

//filename: appstart.js
var add = require('./add');
add(2,2); //4

Plugins

compy supports component's plugins.

Given that, you can use them to work with the language or template engine you want.
For example, to use coffeescript you need to install the plugin in your root folder:

$ npm install component-coffee

Now, after recompilation, all your coffee files will be treated as javascript. That also means you can mix js and coffee files in the same repo.

#filename: add.coffee
module.exports = (a, b) =>
  a + b
//filename: appstart.js
var add = require('./add');
add(2,2); //4

And there is more

Compy is built on top of grunt. Basically it is just a grunt setup, so no magic here. Though lots of stuff is available:

  • components support
  • local require
  • supporting coffeescript, sass, jade and other plugins
  • static server + livereload
  • karma runner
  • grunt extendable

May the force be with you!