On GitHub: rluta/http-nextgen
These are the voyages of the HTTP protocol.
Its continuing mission: to explore strange new devices, to seek out new use cases and new networks,
to boldly go where no browser has gone before!
Arguably the most successful application protocol ever
HTTP/1.1 200 OK
Content-Type: text/html; charset=utf-8
Date: Mon, 01 Dec 2014 16:34:12 GMT
Server: apache
Vary: Accept
Cache-Control: public, max-age=7200
Cache-Control: s-maxage=86400
Expires: Mon, 01 Dec 2014 17:34:12 GMT
ETag: w/"CEFE1C6A-B5DB-4E65-B965-F6356676FC57"
Transfer-Encoding: chunked
Content-Encoding: gzip
Content-MD5: a76aad98ae2b51c35296a4ab222268db
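Note the two Cache-Control lines in the response above: HTTP/1.1 allows a header to appear multiple times, and recipients combine repeated fields into one comma-separated value. A minimal sketch of that combining rule (the `parseHeaders` helper name is ours, not part of any library):

```javascript
// Parse raw HTTP/1.1 header lines into a single map.
// Repeated field names (like the two Cache-Control lines above)
// combine into one comma-separated value, per RFC 7230 §3.2.2.
function parseHeaders(lines) {
  const headers = {};
  for (const line of lines) {
    const idx = line.indexOf(':');
    const name = line.slice(0, idx).trim().toLowerCase();
    const value = line.slice(idx + 1).trim();
    headers[name] = name in headers ? headers[name] + ', ' + value : value;
  }
  return headers;
}

const h = parseHeaders([
  'Content-Type: text/html; charset=utf-8',
  'Cache-Control: public, max-age=7200',
  'Cache-Control: s-maxage=86400',
]);
// h['cache-control'] === 'public, max-age=7200, s-maxage=86400'
```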
8 RFCs to define the protocol (RFC 7230-7237)
GET / HTTP / 1 . 1
Accept: text/plain ; q= 0.01 ,,,,, ,,,,,,, ,,,,,,, ,, ,, ,, ,, ,,,,,,,,, , ,, ,, ,, ,, ,,, ,, ,,,,,,,,, ,, ,,,,,,,,, , ,, ,, ,, ,,,,,,, ,, ,, ,, ,, ,,,,, ,,,,,,, text/* ; q= 1.00
Host: www.apache.org
GET /js/app.js HTTP/1.1
Host: www.breizhcamp.org
Connection: keep-alive
Pragma: no-cache
Cache-Control: no-cache
Accept: */*
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/49.0.2623.87 Safari/537.36
Referer: http://www.breizhcamp.org/
Accept-Encoding: gzip, deflate, sdch
Accept-Language: fr-FR,fr;q=0.8,en-US;q=0.6,en;q=0.4

HTTP/1.1 200 OK
Server: GitHub.com
Content-Type: application/javascript; charset=utf-8
Last-Modified: Fri, 18 Mar 2016 16:36:19 GMT
Access-Control-Allow-Origin: *
Expires: Sat, 19 Mar 2016 09:17:09 GMT
Cache-Control: max-age=600
Content-Encoding: gzip
X-GitHub-Request-Id: B91F1131:38FF:51D3672:56ED16BD
Content-Length: 2480
Accept-Ranges: bytes
Date: Sat, 19 Mar 2016 16:00:32 GMT
Via: 1.1 varnish
Age: 51
Connection: keep-alive
X-Served-By: cache-fra1231-FRA
X-Cache: HIT
X-Cache-Hits: 1
X-Timer: S1458403232.597602,VS0,VE0
Vary: Accept-Encoding
X-Fastly-Request-ID: 4ffe3e91de4fbfdbb8a504fbe406735996cb685e
Request / response only
Synchronous
Limited parallelism (6 to 8 connections per host)
The text/event-stream document type
GET /events HTTP/1.1
Host: localhost:7001

HTTP/1.1 200 OK
Content-Type: text/event-stream; charset=utf-8

data: Beam me up, Scotty !
retry: 500
data: Damn you, Scotty
data: Beam me up !
id: Scotty-1
event: failure
data: The engines are blowing up !
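The wire format above is simple enough to produce by hand: optional `id:`, `event:` and `retry:` lines, one `data:` line per payload line, and a blank line terminating each event. A minimal serializer sketch (the `toEventStream` helper name is ours):

```javascript
// Serialize one event into the text/event-stream wire format:
// optional retry/id/event lines, one "data:" line per payload line,
// and a blank line to terminate the event.
function toEventStream({ id, event, retry, data }) {
  let out = '';
  if (retry !== undefined) out += `retry: ${retry}\n`;
  if (id !== undefined) out += `id: ${id}\n`;
  if (event !== undefined) out += `event: ${event}\n`;
  for (const line of String(data).split('\n')) out += `data: ${line}\n`;
  return out + '\n';
}

const frame = toEventStream({ id: 'Scotty-1', event: 'failure',
                              data: 'The engines are blowing up !' });
// frame === "id: Scotty-1\nevent: failure\ndata: The engines are blowing up !\n\n"
```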
var source = new EventSource("http://localhost:7001/stream");
source.addEventListener('twitter', function (evt) {
  var obj = JSON.parse(evt.data);
  $('#twitter').html(obj.message); // render the tweet (original markup truncated on the slide)
});
Anything that suits a pub/sub architecture
def server = vertx.createHttpServer(), rm = new RouteMatcher()
def clients = []

rm.get('/stream') { HttpServerRequest req ->
    req.response.putHeader('Content-Type', 'text/event-stream')
    req.response.putHeader('Access-Control-Allow-Origin', '*')
    req.response.putHeader('Cache-Control', 'public, no-cache')
    req.response.chunked = true
    req.response.write('retry: 1000\nevent: hello\ndata: {"type":"hello"}\n\n')
    clients << req.response
    req.response.closeHandler { clients.remove(req.response) }
}

vertx.eventBus.registerHandler('events') { Message msg ->
    def jsonBody = new JsonObject((Map) msg.body().data)
    def dataStr = "event: ${msg.body().type}\ndata: ${jsonBody.encode()}\n\n"
    clients.each { HttpServerResponse resp -> resp.write(dataStr) }
}

server.requestHandler(rm.asClosure()).listen(7001)
def twitterFactory = new TwitterStreamFactory().getInstance()
def queries = ['http2']

final StatusListener statusListener = new StatusAdapter() {
    @Override
    public void onStatus(Status status) {
        vertx.eventBus.publish('events', [type: 'twitter',
            data: [id: status.id, from: status.user.name,
                   message: status.text, lang: status.lang]])
    }
}

def connectTwitterStream(twitter, listener, query) {
    twitter.cleanUp()
    twitter.clearListeners()
    twitter.addListener(listener)
    FilterQuery filterQuery = new FilterQuery().track(query as String[])
                                               .language(['fr', 'en'] as String[])
    twitter.filter(filterQuery)
}

connectTwitterStream(twitterFactory, statusListener, queries)
Bi-directional, low latency communication for anything
GET / HTTP/1.1
Upgrade: websocket
Connection: Upgrade
Host: echo.websocket.org
Origin: http://www.websocket.org
Sec-WebSocket-Key: i9ri`AfOgSsKwUlmLjIkGA==
Sec-WebSocket-Version: 13
Sec-WebSocket-Protocol: chat
HTTP/1.1 101 Web Socket Protocol Handshake
Upgrade: websocket
Connection: Upgrade
Sec-WebSocket-Accept: Qz9Mp4/YtIjPcdpbvG8bs=
Sec-WebSocket-Protocol: chat
var ws = new WebSocket("ws://localhost:9000");
ws.addEventListener('open', function (evt) {
  console.log("Socket is connected");
  ws.send('Beam me up, Scotty !'); // only send once the socket is open
});
ws.addEventListener('message', function (evt) {
  receiveMessage(evt.data);
});
ws.addEventListener('close', function (evt) {
  console.log("Socket is closed");
});
Anything Server-Sent Events can do, and more
HTTP (port 80): 67%
HTTP (port 61985): 86%
HTTPS (port 443): 95%
"This results in overall success rates of 63%, 77% and 87%, respectively."
Adam Langley, Google, on the IETF TLS mailing-list
Improving Web performance
POST /request HTTP/1.1
Host: localhost:9000
Accept: text/html, image/jpeg, */*
User-Agent: USS Stargazer (NCC-2893)
Content-Type: application/json

{"name":"Picard","role":"Captain"}
{"name":"Picard","role":"Captain"}
When simply activating SPDY / HTTP/2:
~0 latency, 2k resource, 500k x GET, localhost, 4 clients, using ab + h2load
httpd+h2, Tales of Mystery and Imagination by Stefan Eissing
Can be used just as a better HTTP/1.1
Webperf adjustments may be necessary
Possible advanced usages:
JDK 8 is the minimal requirement
JDK 8 SSLEngine doesn't implement ALPN extension (JEP 244)
Current options:
If using an SSLEngine override:
java -Xbootclasspath/p:_path_to_alpn_boot_jar ...
JDK 9 current release target: mid-2017
Link: </style.css>; rel=preload; as=style
Link: </app.js>; rel=preload; as=script
Link: <https://fonts.example.com/font.woff>; rel=preload; as=font; crossorigin
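Preload Link headers like the ones above can be generated from a resource list. A minimal sketch (the `preloadLinks` helper name and its input shape are ours):

```javascript
// Build a "rel=preload" Link header value from a resource list.
// The as/crossorigin attributes mirror the header examples above.
function preloadLinks(resources) {
  return resources.map(({ href, as, crossorigin }) => {
    let link = `<${href}>; rel=preload; as=${as}`;
    if (crossorigin) link += '; crossorigin';
    return link;
  }).join(', ');
}

const header = preloadLinks([
  { href: '/style.css', as: 'style' },
  { href: 'https://fonts.example.com/font.woff', as: 'font', crossorigin: true },
]);
// header === "</style.css>; rel=preload; as=style, <https://fonts.example.com/font.woff>; rel=preload; as=font; crossorigin"
```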
Use chunkable mime-types for request and response
Server starts reply while request frames are still coming
Implementation examples:
Harder than HTTP/1.1 due to binary framing and encryption