Remote Synchronization Tutorial
DO NOT EDIT THIS PAGE: This page is under heavy active development.
The Quick Start (om.next) introduces Om Next fundamentals. Components, Identity & Normalization covers intermediate concepts. Both of these are necessary reading before proceeding.
While Relay & Falcor both provide good overall caching stories, neither makes any attempt to solve the HTTP caching problem. This may be fine when operating at scale with a large pool of performance experts to draw from, but smaller teams must be able to leverage standard performance advice. HTTP caching is one of the most tried and true techniques for enhancing web application performance, and Om Next supports it out of the box.
Om Next parsing works in two modes. The first, which we've already seen, takes query expressions and turns them into a data tree ready to be fed into component props. However, we haven't yet demonstrated the second: Om Next also supports a parsing mode that derives new query expressions for data that cannot be resolved on the client. This feature enables transparent synchronization with a remote service. We'll see this synchronization component in detail later, but first we'll examine how this architecture permits us to recover HTTP caching.
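To make the two modes concrete before diving in, here is a minimal sketch (the remote name :my-remote and the state shape are purely illustrative, not part of the tutorial): a read function can return a local :value for component props and, under a remote's key, the query fragment (here, its AST) to forward to that remote.

(defn read [{:keys [state ast]} key params]
  {:value     (get @state key)  ;; mode 1: resolve locally into the data tree
   :my-remote ast})             ;; mode 2: forward this fragment to a remote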
At this point we assume you are comfortable with Figwheel configuration, so we will not cover setup. However, note that this tutorial requires some different dependencies; your project.clj should look like the following:
(defproject om-tutorial "0.1.0-SNAPSHOT"
:description "My first Om program!"
:dependencies [[org.clojure/clojure "1.7.0"]
[org.clojure/clojurescript "1.7.170"]
[org.omcljs/om "1.0.0-alpha23"]
[org.clojure/core.async "0.2.371" :scope "test"]
[figwheel-sidecar "0.5.0-SNAPSHOT" :scope "test"]])
The big idea is "messaging" - Alan Kay
Imagine we are designing a live dashboard. This live dashboard will attract a large number of users. Most of these users will simply read the stream and will not interact with it. For this reason we would like to serve these users with a cached response. In a traditional application we could imagine serving these users via a JSON payload from the following URL:
/dashboard/items
If a user is logged in, after rendering the unadorned dashboard we may make a second request for the user's dynamic modifications to the stream from a different endpoint:
/dashboard/user/id/favorites
We would then modify the UI views to reflect this merged state.
The problem is that in Om Next we represent the query as a recursive data structure, not a simple URL. So how can we recover the benefits of HTTP caching?
As it turns out, we can easily separate the static part of the message from the dynamic part: the static part can be trivially hashed, and we can make two requests just as we did before.
Let's see how! We'll be using the same data as Queries With Unions.
In this part of the tutorial we're not at all concerned with what the UI looks like so we're not providing those parts:
(ns om-tutorial.core
(:require [goog.dom :as gdom]
[goog.crypt :as gcrypt]
[cognitect.transit :as t]
[om.next :as om :refer-macros [defui]]
[om.dom :as dom]
[cljs.pprint :as pprint])
(:import [goog.crypt Sha256]))
(enable-console-print!)
(def init-data
{:dashboard/items
[{:id 0 :type :dashboard/post
:author "Laura Smith"
:title "A Post!"
:content "Lorem ipsum dolor sit amet, quem atomorum te quo"}
{:id 1 :type :dashboard/photo
:title "A Photo!"
:image "photo.jpg"
:caption "Lorem ipsum"}
{:id 2 :type :dashboard/post
:author "Jim Jacobs"
:title "Another Post!"
:content "Lorem ipsum dolor sit amet, quem atomorum te quo"}
{:id 3 :type :dashboard/graphic
:title "Charts and Stufff!"
:image "chart.jpg"}
{:id 4 :type :dashboard/post
:author "May Fields"
:title "Yet Another Post!"
:content "Lorem ipsum dolor sit amet, quem atomorum te quo"}]})
(defui Post
static om/IQuery
(query [this]
[:id :type :title :author :content]))
(defui Photo
static om/IQuery
(query [this]
[:id :type :title :image :caption]))
(defui Graphic
static om/IQuery
(query [this]
[:id :type :image]))
(defui DashboardItem
static om/Ident
(ident [this {:keys [id type]}]
[type id])
static om/IQuery
(query [this]
(zipmap
[:dashboard/post :dashboard/photo :dashboard/graphic]
(map #(conj % :favorites)
[(om/get-query Post)
(om/get-query Photo)
(om/get-query Graphic)]))))
(defui Dashboard
static om/IQuery
(query [this]
[{:dashboard/items (om/get-query DashboardItem)}]))
(defmulti read om/dispatch)
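Before looking at the read method, it can help to evaluate the composed query at the REPL. The zipmap in DashboardItem builds a union query with :favorites conj'd onto each branch, so you should see roughly the following (key order may differ):

(om/get-query Dashboard)
;; [{:dashboard/items
;;   {:dashboard/post    [:id :type :title :author :content :favorites]
;;    :dashboard/photo   [:id :type :title :image :caption :favorites]
;;    :dashboard/graphic [:id :type :image :favorites]}}]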
So far so good. Let's take a look at our read method. It does a bit more than we've encountered before. We've also added some simple helpers.
This read function returns not only the local value but also a query for each of a variety of remotes. Later we'll see that when constructing the reconciler we pass along a list of remotes. In this case our local state is provided by :value; the :dynamic portion of the query will be passed along to a remote service, as will the :static portion. The only difference is that we'll use the :static portion to compute a specific URL. There is nothing special about these remotes; you can call them whatever you want and list as many as you like.
(defmethod read :dashboard/items
[{:keys [state ast]} k _]
(let [st @state]
{:value (into [] (map #(get-in st %)) (get st k))
:dynamic (update-in ast [:query]
#(->> (for [[k _] %]
[k [:favorites]])
(into {})))
:static (update-in ast [:query]
#(->> (for [[k v] %]
[k (into [] (remove #{:favorites}) v)])
(into {})))}))
(defn sha-256 [s]
(let [sha (Sha256.)
_ (.update sha s)]
(gcrypt/byteArrayToHex (.digest sha))))
(def p (om/parser {:read read}))
(def w (t/writer :json))
(def app-state (atom (om/tree->db Dashboard init-data true)))
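Because tree->db is called with true, init-data is normalized using DashboardItem's [type id] ident: @app-state now holds a vector of idents plus one table per item type. The sketch below is approximate (entity fields and Om Next's bookkeeping keys are elided), just to show the shape:

;; Approximate shape of @app-state after normalization (sketch only):
;; {:dashboard/items   [[:dashboard/post 0] [:dashboard/photo 1] [:dashboard/post 2]
;;                      [:dashboard/graphic 3] [:dashboard/post 4]]
;;  :dashboard/post    {0 {:id 0 :type :dashboard/post :author "Laura Smith" ...} ...}
;;  :dashboard/photo   {1 {:id 1 :type :dashboard/photo :title "A Photo!" ...}}
;;  :dashboard/graphic {3 {:id 3 :type :dashboard/graphic :title "Charts and Stuff!" ...}}}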
Notice that all read (and mutation) functions receive a simple AST (Abstract Syntax Tree) representing the current portion of the query. In our map, besides returning :value, we can also return modified query fragments to produce different queries.
Let's see this in action at the REPL:
(p {:state app-state} (om/get-query Dashboard) :static)
The result should be familiar, except we're missing the :favorites key. If you examine the :static key in the read function above it should now be clear what's going on: we're changing each query expression in the union query.
Try the following:
(let [query (p {:state app-state} (om/get-query Dashboard) :static)
json (t/write w query)
hash (.substring (sha-256 json) 0 16)]
(str "/api/" hash))
;; "/api/02e397cc1447d688"
You should see the exact same URL on your machine.
When we write our send function and request the :static portion of the message, we'll use this convenient URL.
Now what about the dynamic query?
(p {:state app-state} (om/get-query Dashboard) :dynamic)
;; [{:dashboard/items
;; {:dashboard/post [:favorites],
;; :dashboard/photo [:favorites],
;; :dashboard/graphic [:favorites]}}]
There's our dynamic query.
Using the same parsing infrastructure we can present a merged view of local state, HTTP cached state, and the user's dynamic query without issue.
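This tutorial doesn't implement a send function for the dashboard, but a rough sketch of the idea might look like the following. Everything here is an assumption for illustration: the fetch-edn helper, the /api/ and /dashboard/user/me/favorites endpoints, and the idea that both responses are merged via the same callback.

(defn fetch-edn
  "Hypothetical helper: GET url and invoke cb with the decoded response.
   Left unimplemented; any HTTP client (e.g. goog.net.XhrIo) would do."
  [url cb])

(defn send [{:keys [static dynamic]} cb]
  (when static
    ;; the static query hashes to a stable, HTTP-cacheable URL
    (let [hash (.substring (sha-256 (t/write w static)) 0 16)]
      (fetch-edn (str "/api/" hash) cb)))
  (when dynamic
    ;; the user-specific query goes to an uncached, per-user endpoint
    (fetch-edn "/dashboard/user/me/favorites" cb)))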
In order to understand how all the remote support fits together we will build the simplest possible auto-completion widget. We will elide all the various bits of UX finesse that would distract from understanding how Om Next deals with remote data sources.
Make a file resources/public/index.html:
mkdir -p resources/public
touch resources/public/index.html
Change the contents of this file to the following:
<!DOCTYPE html>
<html>
<head lang="en">
<meta charset="UTF-8">
<title>Om Tutorial!</title>
</head>
<body>
<div id="app"></div>
<script src="js/main.js"></script>
</body>
</html>
Clear out your om-tutorial.core namespace. The ns form should now look like the following:
(ns om-tutorial.core
(:require-macros [cljs.core.async.macros :refer [go]])
(:require [goog.dom :as gdom]
[cljs.core.async :as async :refer [<! >! put! chan]]
[clojure.string :as string]
[om.next :as om :refer-macros [defui]]
[om.dom :as dom])
(:import [goog Uri]
[goog.net Jsonp]))
We then add some top level definitions and helpers. Our widget will simply auto-complete with search results from Wikipedia:
(enable-console-print!)
(def base-url
"http://en.wikipedia.org/w/api.php?action=opensearch&format=json&search=")
(defn jsonp
([uri] (jsonp (chan) uri))
([c uri]
(let [gjsonp (Jsonp. (Uri. uri))]
(.send gjsonp nil #(put! c %))
c)))
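As a quick, hypothetical REPL check of the helper (the opensearch response is an array whose first element echoes the search term and whose second element holds the matching titles):

(go
  (let [[term titles] (<! (jsonp (str base-url "clojure")))]
    (println term titles)))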
Next let's write our read function for our parser:
(defmulti read om/dispatch)
(defmethod read :search/results
  [{:keys [state ast] :as env} k {:keys [query]}]
  (merge
    {:value (get @state k [])}
    (when-not (or (string/blank? query)
                  (< (count query) 2))
      {:search ast})))
There are a few non-obvious ideas here. We say that we want to get remote data if the query parameter is not blank and the letter count is greater than or equal to 2. This remote is named :search.
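With the read function in hand you can check at the REPL which expression the :search remote will be handed. Defining a throwaway parser (the reconciler below builds its own), parsing against an empty state with a long-enough query should return roughly:

(def p (om/parser {:read read}))

(p {:state (atom {})} '[(:search/results {:query "clo"})] :search)
;; => [(:search/results {:query "clo"})]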
Not only do remotes provide a good story for HTTP caching, they also provide a good way to divide up async concerns. A query that is modified as the user types is a good case for enabling specific throttling or other modifications that normal application data fetches will not need.
Let's write our UI code:
(defn result-list [results]
(dom/ul #js {:key "result-list"}
(map #(dom/li nil %) results)))
(defn search-field [ac query]
(dom/input
#js {:key "search-field"
:value query
:onKeyUp
(fn [e]
(om/set-query! ac
{:params {:query (.. e -target -value)}}))}))
(defui AutoCompleter
static om/IQueryParams
(params [_]
{:query ""})
static om/IQuery
(query [_]
'[(:search/results {:query ?query})])
Object
(render [this]
(let [{:keys [search/results]} (om/props this)]
(dom/div nil
(dom/h2 nil "Autocompleter")
(cond->
[(search-field this (:query (om/get-params this)))]
(not (empty? results)) (conj (result-list results)))))))
This code is surprisingly declarative - it describes only what, not how. As the user types we change the query. How the data is fetched is someone else's problem.
Let's write the how code.
We will create a core.async go loop that will listen for new search queries:
(defn search-loop [c]
(go
(loop [[query cb] (<! c)]
(let [[_ results] (<! (jsonp (str base-url query)))]
(cb {:search/results results}))
(recur (<! c)))))
Pretty straightforward. The cb is the callback provided by Om Next itself. This callback simply takes the novelty and merges it back into the application state.
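Earlier we noted that a dedicated search remote is a natural place for throttling. As an illustration only (not part of the tutorial; the 250ms window and the name debounced-search-loop are invented), the loop could wait for a short quiet period after the latest query before issuing the JSONP request:

(defn debounced-search-loop [c]
  (go
    (loop [[query cb] (<! c)]
      (let [t (async/timeout 250)
            [v port] (async/alts! [c t])]
        (if (= port t)
          ;; no newer query arrived within 250ms; issue the request
          (let [[_ results] (<! (jsonp (str base-url query)))]
            (cb {:search/results results})
            (recur (<! c)))
          ;; a newer query superseded this one; start over with it
          (recur v))))))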
Next we need the send function that we'll supply to the reconciler:
(defn send-to-chan [c]
(fn [{:keys [search]} cb]
(when search
(let [{[search] :children} (om/query->ast search)
query (get-in search [:params :query])]
(put! c [query cb])))))
This function takes a core.async channel and returns a suitable send function. In our case this is not fully generic, as we just want to show the basics; our send function only handles search queries. We want to extract the search string without bothering with the surface query expression syntax, so we leverage om.next/query->ast. Getting the information out of the AST is simple.
Finally we put it all together:
(def send-chan (chan))

(def reconciler
(om/reconciler
{:state {:search/results []}
:parser (om/parser {:read read})
:send (send-to-chan send-chan)
:remotes [:remote :search]}))
(search-loop send-chan)
(om/add-root! reconciler AutoCompleter
(gdom/getElement "app"))
We supply :send and we declare the :remotes present in our queries.
That's it!
All the complexity around asynchrony has been pushed to the edge of the system. The auto-completion widget is blissfully unaware of the details of how data will be fetched, making it considerably easier to test and reason about.
The isolation of send also means that the asynchronous portion of your application is easier to mock and test.
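For example (a hypothetical REPL exercise, not from the tutorial), the function returned by send-to-chan can be driven with a fake callback and inspected on the other end of the channel, no UI or network required:

(def test-chan (chan))

((send-to-chan test-chan)
 {:search '[(:search/results {:query "clo"})]}
 (fn [novelty] (println "would merge:" novelty)))

(go
  (let [[query cb] (<! test-chan)]
    (println "queued query:" query)))
;; prints: queued query: clo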