
Disco, Simple Git Browsing

Disco is a tiny app I built over the weekend to try to make browsing git better.

Inspired heavily by Tower, it also adds pull request integration and runs in the browser. It’s only for public repositories.

Check out:

http://disco.88cartell.com/#/rails/rails

http://disco.88cartell.com/#/emberjs/ember.js

http://disco.88cartell.com/#/angular/angular.js

Analytics, Errors and Polymer

With Revisit.io in beta, the priority right now is reviewing which features are being used and responding to feedback, bugs and errors. After looking into a bunch of analytics tools and error-tracking apps, it became clear that an app combining error tracking and analytics would be very useful.

Revisit Analytics and Errors is a custom prototype that combines:

  • Analytics on a per-user basis
  • Error tracking
  • Grouping data by day
  • Filtering analytics and errors by user and day, to show the path a user took to reach any given error

Architecture

Having a tiny data model and wanting the app to be super fast made Golang stand out as the right choice for the backend. A bonus was being able to run it on a 512 MB server. After writing a lot of Elixir and applying Orchestration Focused Design, it has become much easier to abstract away common issues in Golang, which makes it very enjoyable to program in.

There has been a lot of hype recently about Polymer as a new structure for frontend development. The platform has been maturing quite rapidly, and it was worth trying it in a small project before diving into something bigger.

The architecture for the app is shown below:

[Architecture diagram: a Polymer frontend (core-ajax, time-group and analytics-event components) talks to a Golang backend (CreateEventsHandler and ListEventsHandler over an in-memory array), which receives events from Revisit.io.]

Building Sourcegraph, a large-scale code search engine in Go is a great post on how to structure Golang web applications. Following it meant that only one external package was required for this application: Gorilla mux. Best Practices for Production Environments is also a great breakdown of tips for writing Golang code.

Routing in the backend is quite simple:

router := mux.NewRouter()
router.HandleFunc("/events.json", ListEventsHandler).Methods("GET")
router.HandleFunc("/events.json", CreateEventHandler).Methods("POST")
router.HandleFunc("/create_event.json", CreateEventHandler).Methods("GET")

http.Handle("/", router)

http.ListenAndServe(":3001", nil)
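To make the route table concrete, here is one way the two handlers and the in-memory array could be wired up. This is a sketch rather than the real Revisit code: the Event fields and the addEvent/listEvents helpers are all assumptions.

```go
package main

import (
	"encoding/json"
	"net/http"
	"sync"
	"time"
)

// Event is a guess at the minimal shape of the data model; the real
// app presumably tracks more fields.
type Event struct {
	User      string    `json:"user"`
	Name      string    `json:"name"`
	CreatedAt time.Time `json:"created_at"`
}

var (
	mu     sync.Mutex // handlers run concurrently, so the slice needs a guard
	events []Event
)

// addEvent stamps an event with the current time and appends it to
// the in-memory array.
func addEvent(e Event) {
	e.CreatedAt = time.Now()
	mu.Lock()
	events = append(events, e)
	mu.Unlock()
}

// listEvents returns a copy so callers can't mutate the store.
func listEvents() []Event {
	mu.Lock()
	defer mu.Unlock()
	return append([]Event(nil), events...)
}

// ListEventsHandler serves the whole array as JSON.
func ListEventsHandler(w http.ResponseWriter, r *http.Request) {
	w.Header().Set("Content-Type", "application/json")
	json.NewEncoder(w).Encode(listEvents())
}

// CreateEventHandler decodes an event from the request body and stores it.
func CreateEventHandler(w http.ResponseWriter, r *http.Request) {
	var e Event
	if err := json.NewDecoder(r.Body).Decode(&e); err != nil {
		http.Error(w, err.Error(), http.StatusBadRequest)
		return
	}
	addEvent(e)
	w.WriteHeader(http.StatusCreated)
}
```

With mux routing both verbs of /events.json to these two functions, Gorilla mux stays the only external dependency.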

The other big tip with Golang is: don’t be fancy. For loops are used over and over in the code to filter the main array. They are quick and simple. Abstracting them is very tempting, but the Golang ethos is to keep programming simple, even if that involves a little repetition. Below is an example:

func filterEventsByDate(events []Event, date string) []Event {
  var eventsByDate []Event

  for i := 0; i < len(events); i++ {
    if events[i].CreatedAt.Format("2006-01-02") == date {
      eventsByDate = append(eventsByDate, events[i])
    }
  }

  return eventsByDate
}

On to Polymer. Polymer lets you structure frontend components in a single file. Each file encapsulates its own templating, stylesheets and JavaScript.

The structure of a Polymer component looks like:

<link rel="import" href="/bower_components/polymer/polymer.html">
<link rel="import" href="/bower_components/core-ajax/core-ajax.html">

<polymer-element name="time-group">
  <template>
    <style>
      :host .analytics-events {
        position: absolute;
        width: 82.5%;
        top: 4rem;
        right: 0;
      }
    </style>

    <core-ajax auto url="/events.json" handleAs="json" on-core-response="{{eventsLoaded}}"></core-ajax>

    <section class="analytics-events">
      <template repeat="{{event in events}}">
        <analytics-event on-select="{{eventSelected}}" event="{{event}}"></analytics-event>
      </template>
    </section>
  </template>

  <script>
    Polymer('time-group', {
      events: null,

      created: function() {
        this.events = [];
      },

      eventsLoaded: function(event, result) {
        this.events = result.response || [];
        this.eventCount = this.events.length;
      }
    });
  </script>
</polymer-element>

The code above is a skeleton of the real code, but you can see how every section works together to encapsulate the display and behaviour of the component.

Because Polymer uses the Shadow DOM, each component is fully encapsulated. This article http://www.html5rocks.com/en/tutorials/webcomponents/shadowdom-201/ has a great explanation of how Shadow DOM styling works and how you can punch through it when needed. The hard part is that styles can’t leak from the main stylesheet into the Shadow DOM, which meant all of the Sass helpers were useless without a small trick. Below is an example:

.events-menu a.logo {
  color: pink;
}

Without Shadow DOM punching

If you explicitly reference the ::shadow pseudo-element then you can create styles that apply across Shadow DOM elements.

::shadow .events-menu a.logo {
  color: pink;
}

With Shadow DOM punching

Polymer leverages components to remove as much hand-coded JavaScript as possible. A good example is the core-ajax component, which provides an HTML-tag-like interface for making an AJAX call. It can call a JavaScript function or simply set a bound variable. If the data doesn’t need any transformation, having it populate a variable that is then iterated over is a simple way to list data from a server in plain old tags.

Components publish the variables that they allow invokers to pass in, and variables bound in the form {{variable}} are passed by reference to the component, so the template can access all of the nested data in the variable. A simple example is below:

var event = { name: 'Revisit Event' };

<analytics-event event="{{event}}"></analytics-event>

<link rel="import" href="/bower_components/polymer/polymer.html">

<polymer-element name="analytics-event" attributes="event">
  <template>
    <style>
    </style>

    <div class="name">{{event.name}}</div>
  </template>

  <script>
    Polymer('analytics-event', {});
  </script>
</polymer-element>

Overall, Polymer feels like an evolution of nesting controllers in Angular and of laying out code in React. Polymer has all the fancy two-way binding that Angular has. Most people are using Polymer with Angular for routing, which sounds like a great idea. I definitely think it is a great way to make apps and I’d consider using it over straight Angular.

My next post will compare plain Angular with Polymer plus Angular on the same code base.

Building Golang Code on OSX from Homebrew for Linux

After searching for a while I couldn’t find this exact snippet. Use it to build Go code on your Mac for Linux servers.

export GOROOT=/usr/local/Cellar/go/1.3/libexec # Make sure this gets set at login.
cd $GOROOT/src
GOOS=linux GOARCH=amd64 ./make.bash --no-clean # Only need to do this once.

cd PATH_TO_PROJECT
GOOS=linux GOARCH=amd64 go build FILE.go

Orchestration Focused Design

Recently I’ve started worshipping the Crossfit cult. A lot of people in Crossfit take to social media to share their involvement with the sport. Coming from the programming world, it was easy to start following Crossfit people on Instagram and Facebook, but I didn’t want to invest the time in:

  • Finding all the right people to follow; and
  • Having to check 2 - 3 feeds for Crossfit news.

For a weekend project I made thechipper.io. This project grabs all of the top Crossfit feeds and puts them in one place.

This post is about coupling, not Crossfit. Let’s start with the architecture of the project.

[Architecture diagram: Angular JS frontend → Sinatra → Sequel → database, fed by Fetch Instagram and Fetch Facebook workers.]

This is a standard architecture for an application of this type, but the design rules aren’t standard:

  1. All data objects are structs without any extra methods.
  2. All database models are dumb transaction processors, nothing more.
  3. Logic to be held in services.
  4. Services are to be stateless.
  5. State is to be held in main loops and external interfaces.

The patterns here are very familiar after writing code in Elixir or other functional languages.

The experiment here is: if you code with these goals in mind and push any object-oriented-ness out to adapters (for external interfaces only), do you end up writing code that is easier to reason about and maintain?

Serving posts is the simplest slice of code to break down:

class Server < Sinatra::Base
  get "/posts.json" do
    page = (params[:page] || 1).to_i
    PostService.all_posts(page).to_json
  end
end

module PostService
  class << self
    def all_posts(page)
      total = Post.order(Sequel.desc(:created_at)).count
      posts = Post.order(Sequel.desc(:created_at)).paginate(page, 40).all
      PostRequest.new(total: total, posts: posts, page: page)
    end
  end
end

class Source < Sequel::Model; end
class Post < Sequel::Model; end

class PostRequest < Hashie::Dash
  property :total
  property :posts
  property :page
end

  • All services are modules without any state.
  • All models are either dumb structs with transaction processors attached, or just fancy hashes.

Fetching posts from Instagram or Facebook is more complicated but uses the same design rules.

# fetch_instagram.rb
loop do
  sources = SourceService.for_instagram
  sources.each do |source|
    source = InstagramService.ensure_system_id_is_set source

    InstagramService.posts_for(source.system_id).each do |instagram_attrs|
      PostService.create InstagramToPostTransformer.transform(instagram_attrs)
    end

    sleep 5
  end
end

module InstagramService
  class << self
    def user_id_for(username)
      ...
    end

    def posts_for(system_id)
      ...
    end

    def ensure_system_id_is_set(source)
      ...
    end
  end
end

module SourceService
  class << self
    def for_instagram
      ...
    end

    def for_facebook
      ...
    end

    def update_system_id(source, system_id)
      ...
    end
  end
end

module PostService
  class << self
    def create(post_attrs)
      ...
    end
  end
end

module InstagramToPostTransformer
  class << self
    def transform(instagram_attrs)
      ...
    end
  end
end

I have removed the logic from the code to keep it brief.

fetch_instagram.rb illustrates a different approach to coupling. The typical approach is one-way coupling, where each layer passes data down the stack. In this approach the main loop (fetch_instagram.rb) is the orchestration layer, and it holds the logic to push and pull data between all of the other layers in the stack.

This becomes the major system design rule: only orchestration layers can depend on other tiers. No other dependencies are allowed.

Comparing this approach to the one I used in a recent large application:

[Diagram: Controller/Route → Coordinator → Service → Command → Adapter.]

The design of the interface would change to:

[Diagram: Controller/Route → Service → Service → Command → Adapter.]

The goal of this approach is:

  • Controllers/routes tests are automatically full stack tests and can be consistent across the system.
  • Testing functions in every other tier of the system is isolated and requires very little setup.
  • Programmers can forget about what state an object is in and focus on transforming data.

The downside is that controllers grow in size, but the upside is that complexity is brought to the surface rather than hidden in iceberg-style services. The approach also allows new tiers to be introduced, like transformers for converting between data structures and validators that just check data integrity.

Summing up, the rules for designing systems in this manner are:

  1. All data objects are structs without any extra methods.
  2. All database models are dumb transaction processors, nothing more.
  3. Logic to be held in services.
  4. Services are to be stateless (i.e. use modules).
  5. State is to be held in main loops and external interfaces.
  6. Only orchestration layers (routers, controllers, main loops) can depend on other tiers.

Having used this on a small system with great success, I’m confident it will greatly help with large system design too.
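The rules translate outside Ruby too. Here is a minimal Go sketch of the same shape, where the data object is a bare struct, the services are stateless functions, and the only state (the seen set) lives in the orchestration loop. Every name in this sketch is hypothetical.

```go
package main

import "fmt"

// Rule 1: data objects are plain structs with no methods.
type Post struct {
	Source string
	Body   string
}

// Rules 3-4: logic lives in stateless service functions that take
// everything they need as arguments and return new data.
func transform(raw string, source string) Post {
	return Post{Source: source, Body: raw}
}

// dedupe is also stateless: the seen set is passed in by the caller.
func dedupe(seen map[string]bool, posts []Post) []Post {
	var fresh []Post
	for _, p := range posts {
		if !seen[p.Body] {
			fresh = append(fresh, p)
		}
	}
	return fresh
}

// Rules 5-6: the main loop is the only tier that depends on the
// others, and it is where state is held.
func main() {
	seen := map[string]bool{"old post": true}
	raw := []string{"old post", "new post"}

	var posts []Post
	for _, r := range raw {
		posts = append(posts, transform(r, "instagram"))
	}
	for _, p := range dedupe(seen, posts) {
		seen[p.Body] = true
		fmt.Println(p.Body) // prints "new post"
	}
}
```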

I love flame wars so tweet me your thoughts: @cjwoodward and use the hashtag #orchestrationfocuseddesign.

Web Sockets, Ember and Elixir

Revisit.io is predominantly built with Elixir and Ember.js, with some Node.js and Ruby thrown in for good measure. One thing that makes it quite different and interesting is that communication between the client and the application happens over web sockets.

Posting a bookmark in Revisit

When a user posts a bookmark in Revisit the process is broken down into:

[Diagram: submit URL → start screenshot capture → request tags → broadcast bookmark; then upload screenshot → broadcast bookmark.]

On the frontend, using web sockets allows Revisit to skip polling for changes and simply receive a push message when a bookmark is updated.

The most interesting step here is broadcasting the bookmark. It works like a chat server: it iterates over the users who should receive the message and sends it to each of their open connections:

[Diagram: get followers → check active connection → send bookmark to follower.]

Elixir makes it really easy to persist the pid of a connection to the database, as a string, so it can be associated with a user:

pid_string = to_string :erlang.pid_to_list(self)

Checking whether that process is still running is simple once the string is converted back into a pid:

pid = :erlang.list_to_pid(to_char_list(pid_string))
Process.alive?(pid)

Using send with that pid sends the data back to the socket in Elixir:

send pid, { :message, ExJSON.generate([uuid: "push", payload: App.TimelineEntriesPresenter.present(timeline_entry), type: "timeline_entry"]) }

Once data is being pushed to the client, adding support for web sockets to Ember is well documented around the web.

Revisit’s implementation generates a UUID for all requests that are initiated by the client. When a request is initiated by the server instead, a “push” message is received and interpreted by the client, and the data is pushed into Ember Data like so:

var payload = serializer.extract(this, type, adapterPayload, null, 'findAll');

this.pushMany(type, payload);

Ember does an amazing job of updating the screen when new data is sent and pushed into Ember Data. Using store.filter is a great way to ensure live data matches the query that was made to the server previously.

Browser gets Sleepy

When a user closes their laptop and comes back later, they will expect the application to respond normally to clicks, even though the web socket has died (without erroring).

Luckily, it is simple to trap this in the browser and re-establish the connection.

window.addEventListener("online", establishConnection);

Join the Beta

If you like bookmarks, join the beta: Revisit.io.
