Nodevember 2015 - Day 2


There was such a good response from my Nodevember 2015 - Day 1 Recap - thank you to all who read and provided feedback! I hope this Day 2 recap is also of benefit :) Let's jump right in :)


Blocking Across The Wire - Kyle Simpson

This talk was mind-bending - not just for me, but for other folks as well:

(Embedded tweet: Jeremy McDuffie's reaction to Kyle's talk)

Before I get into the mind-bending stuff he talked about (the majority of which was over my head :p), I'd like to start with how the talk began.

Kyle's first slide mentioned a topic he is passionate about - #PrivilegeAwareness. The way he presented it, I feel, was non-confrontational and non-militant - just sharing something that was important to him, take it or leave it. Essentially, he said it's important to recognize and be aware of the privileges we've had in life that have helped us get where we are.

This hit home with me. Even though I come from a family with a history of drugs, violence, domestic issues, etc... I've received breaks that helped me to be different and to treat my family better. I doubt I'd be where I am without these breaks - although I'm highly self-motivated and driven, that wouldn't have done it alone. Recognizing the privileges you've had doesn't in any way mean that you've had it easy (a misconception I have had myself); it just means you acknowledge that you've had these breaks and are thus empowered to:
a) Show empathy to others who may not have had those breaks and
b) Be mindful to provide those breaks to others where you can.
These two items, to me, are intertwined - without empathy you will not be inclined to help give breaks to others, and if you don't try to give breaks to others then your empathy does no good.

Now, on to the technical part of the talk :) He started out talking about concurrency vs parallelism:

Concurrency: More than one higher level task happening at the same slice of time

Parallelism: More than one higher level task happening at exactly the same moment

He went into how coordination is the difficult part - running separate tasks on separate threads is easy, but running separate tasks that communicate with each other across threads is difficult. So, we've gone through an evolutionary period in JavaScript where we've tried to coordinate concurrency and even simulate parallelism (JavaScript is single-threaded, so there's no true parallelism per the definition given): callbacks, then promises, now generators, soon(ish) async/await, and a little thing called Communicating Sequential Processes (CSP) to help reason about it all! I won't even attempt to go deeper than that, as it's certainly over my head at the moment, but I highly recommend checking out the video that will be posted from the conference!
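Just to make the progression concrete, here's a rough sketch of my own (not Kyle's example) of the same two-step async flow at three points in that evolution - fetchJSONCallback, fetchJSON, and the endpoints are hypothetical placeholders, and the generator-plus-runner style sits between steps 2 and 3:

// 1) Callbacks - each step nests inside the previous one
function loadProfileWithCallbacks(userId, done) {
  fetchJSONCallback('/users/' + userId, function (err, user) {
    if (err) return done(err);
    fetchJSONCallback('/orders/' + user.id, function (err, orders) {
      if (err) return done(err);
      done(null, { user: user, orders: orders });
    });
  });
}

// 2) Promises - the steps chain instead of nesting
function loadProfileWithPromises(userId) {
  return fetchJSON('/users/' + userId)
    .then(user => fetchJSON('/orders/' + user.id)
      .then(orders => ({ user, orders })));
}

// 3) async/await - reads like synchronous code, still non-blocking underneath
async function loadProfileWithAsyncAwait(userId) {
  const user = await fetchJSON('/users/' + userId);
  const orders = await fetchJSON('/orders/' + user.id);
  return { user, orders };
}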


Making your JS Code Debuggable - Patrick Mueller

This was a solid talk that went over some practical techniques for both making your code debuggable and doing the debugging. Most items are aimed at Node specifically, but some are applicable to client-side JS also. Here are some high points:

  • Use named functions
  • Instead of pyramids of callbacks, call those named functions (see the sketch just below this list)
  • Use a coding standard utility, e.g. standard, eslint, etc...
  • Keep functions short; V8 will "inline" short functions
  • One-line arrow functions need no return or braces, e.g. [1,2,3].map(x => Math.sqrt(x))
  • console.log - include the file name in the message
  • console.trace prints a stack trace from wherever it's called - even without any error being thrown
  • V8 has a feature called Error.prepareStackTrace
Error.prepareStackTrace = function(err, stackTrace) { 
  //can change formatting of output of trace, etc... 
}
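
Here's a small sketch of my own (not from the talk) pulling a few of those points together - named functions instead of a callback pyramid, console.trace, and file names in log messages; the config file and its contents are just placeholders:

const fs = require('fs');

// name each step and pass the named function as the callback,
// so it shows up by name in stack traces and profiles
function readConfig() {
  fs.readFile('./config.json', 'utf8', onConfigRead);
}

function onConfigRead(err, contents) {
  if (err) {
    // prints a stack trace from this point, even though nothing was thrown
    console.trace('config.js: failed to read config');
    return;
  }
  // include the file name so you can tell where the log line came from
  console.log('config.js: loaded', JSON.parse(contents).name);
}

readConfig();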

He also made a point that I'll be keeping in mind in the future - "One important optimization that people often miss is to optimize for readability" (emphasis mine).


What Every NodeJS Developer Needs to Know About Elixir - Bryan Hunter

Bryan is a local (to Nashville, TN) and an all-around awesome guy. I've been following him on twitter for a while now, and when seeing his tweets on functional programming and Erlang I'm generally thinking "Oh yeah, I need to look into that sometime". In this talk, he convinced me - I have to look into this Erlang and Elixir stuff! Here are some general points and some comparisons he made between NodeJS and Elixir that I find astounding:

  • With Node, the recommendation is to never write blocking code - after all, Node is single-threaded with a single event loop (see the sketch after this list)
  • With Elixir, it is impossible to block
  • Node requires constant diligence - e.g. being mindful of the event loop, writing non-blocking code, etc...
  • Any time you require constant diligence, you are guaranteeing failure
  • Imperative languages mutate state
  • Functional languages transform inputs into outputs
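
Here's my own tiny illustration of the Node point (not Bryan's code): any long synchronous chunk of work monopolizes the single thread, so everything else - timers, I/O callbacks, incoming requests - has to wait:

// the timer is due in 100ms, but it can't fire until the loop below finishes
setTimeout(() => console.log('timer finally fired'), 100);

let total = 0;
for (let i = 0; i < 2e9; i++) { // blocking, CPU-bound work on the only thread
  total += i;
}
console.log('done blocking after', total, '- only now can the timer callback run');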

From the Ash: PhoenixJS and WebSockets - Max Beizer

Max went over how PhoenixJS can help make it easy to communicate back to an Elixir backend with WebSockets. He went through the obligatory WebSockets demo of making a chat application, and his talk was great. However, what I think folks will remember the most is that someone from the crowd sent a chat message that was an alert(...), and it totally popped up a browser alert b/c he wasn't sanitizing the messages! One of the most memorable events in any of the sessions! #amirite
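
The underlying lesson travels well beyond Phoenix, though - treat user-supplied content as text, never as HTML. A tiny sketch of my own (not Max's code), assuming the client appends incoming messages to a list:

function appendMessage(list, message) {
  const item = document.createElement('li');
  // textContent treats the message as plain text; assigning to innerHTML
  // would let markup (and the alert from the demo) actually run
  item.textContent = message;
  list.appendChild(item);
}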


Static Sites with React - Robert Pearce

This talk was interesting, and took a turn I didn't expect - static site generators. Robert started out by talking about how things have kind of come full circle:

  • The first static site generator was Dreamweaver
  • Then we hit the explosion of data-driven sites, and generators lost a ton of popularity
  • Now, many are expecting generators to be the next big thing

Given the rich server-side rendering capability that React has, it seems to be a really good fit for ushering in a new era of static site generators! So, he has started a project on github called react-static and he gave a demo of it. Definitely worth checking out!
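
As a rough idea of what that looks like (a minimal sketch of my own, not react-static itself): render a component to plain HTML at build time with react-dom/server and write the result out as a static file.

const fs = require('fs');
const React = require('react');
const ReactDOMServer = require('react-dom/server');

// a trivial page component, written with createElement to keep the sketch build-free
function Page(props) {
  return React.createElement('html', null,
    React.createElement('body', null,
      React.createElement('h1', null, props.title)));
}

// renderToStaticMarkup emits clean HTML with no React-specific attributes
const html = '<!doctype html>' +
  ReactDOMServer.renderToStaticMarkup(React.createElement(Page, { title: 'Hello, static site' }));

fs.writeFileSync('index.html', html);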


Styling React Components in JS - Michael Chan

Michael gave some excellent observations during this talk, my favorite one being "The future is crafted by pain". We tend to focus our energies on solving our pain-points. He went on to describe one such pain point, one which I think all web dev folks can commiserate over: CSS.

Pain points (the main points are from his talk; the supporting scenarios are my own, as I can't remember his specific ones):

  • You have to hunt down styles that are affecting an element - the more classes applied to the element, the merrier the hunting...
  • Recognizing state-specific styles isn't always intuitive - e.g. what does a class of danger actually mean for a given element? It could have different non-apparent meanings for different elements in different contexts, all on the same page...
  • Separation of component and CSS for reusability IS A LIE - agreed; aside from some utility-type classes, the bulk of the styles applied to a component are tied to that component, and that component alone.

So, he walked through what happens if you move the styles over into the JavaScript (I've added a small sketch of my own after the cons). Here are a few pros and cons he gave:

Pros

  • Component-specific styling is right with the component
  • Styles are testable - if the data's state dictates a certain style should be applied, you can render the component to a string and have your unit tests verify it is correct!
  • Minimal CSS files :)

Cons

  • Browser events are more difficult to style - e.g. :hover and :active
  • Media Queries are more difficult

But, for the cons, he gave a few libraries and tools that are in the works to help with these problems.
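
Here's the small sketch I mentioned above (my own, not Michael's example) of the styles-in-JS idea - the style object lives next to the component and is picked based on the data's state:

const React = require('react');

const baseStyle = { padding: '8px', borderRadius: '4px' };
const dangerStyle = Object.assign({}, baseStyle, { background: 'crimson', color: 'white' });

function Alert(props) {
  // the data's state dictates the style, so a unit test can render this
  // component to a string and assert that the right style came out
  const style = props.isDanger ? dangerStyle : baseStyle;
  return React.createElement('div', { style: style }, props.message);
}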

Aside from the technical content, Michael's presentation style is very enjoyable. He used some very clever images and has a great sense of humor - when the videos get posted, watch the one from this talk and you'll be glad you did :)


Keynote: The Seif Project - Douglas Crockford

I want to start off by saying two things:
a) Seif is pronounced "safe"
b) Most of this talk was a bit out of my league. Douglas is a brilliant engineer, so I expected that :) Thus, I'll try to be light on opinions, and you can take my opinions lightly ;-)

The talk was very interesting, to say the least. We heard thoughts ranging from "everything is terrible" to "I haven't found anything better". In a nutshell, Douglas pointed out some of the harsh realities regarding security, or the lack thereof, with the interwebs as it stands today. He has a plan to change that. Let's hit some highlights:

First things first, the transition plan...
If we hark back to the days when America switched from analog TV to digital, we can see some specific points that helped make that transition a success:

  • Broadcasters and producers were forced to purchase updated equipment, with no new revenue sources, to support broadcasting and creating content in HD.
  • Consumers that needed them were provided converter boxes that would take the new digital signal and make it work on their existing televisions.

The focus here was on making the transition as painless as possible for consumers. This is reasonable: concentrating the pain-points at the broadcaster and producer level means orders of magnitude fewer pain-points than if every consumer had one. We need to take the same approach when upgrading the web to be more secure.

Then he laid out his plan. I won't go over all the details, but the bottom line is that every URL you visit - banking, retail, medical, etc... - would include your unique ID, based off of a private key. Honestly, I was left a little fuzzy on whether you would have one private key per resource, or one key alone that you allowed each provider to use in signing the public key you'd use for their site/app. But, here are the high points:

  • Use "helper apps" that the browser kicks off at appropriate times (I totally didn't even know this was a thing)
  • Convince 1 progressive browser to integrate
  • Make 1 secure site to require that browser
  • Risk mitigation will compel the other secure sites
  • Competitive pressure will move the other browser providers
  • World will follow for improved security and faster app dev
  • We'll no longer have to rely on cert authorities, as is the case for SSL. This is good b/c they can't haz teh trusts.
  • Unique ID will be made with uber-encryption and hashing algorithms - ECC 521, AES 256, SHA 3-256 - so will be extremely secure and, given that private key will exist on the device, seems impossible to socially hack.
  • If your device dies or is lost/stolen and you haven't made a redundant copy of your private key(s), then you're SOL.
  • Folks that currently share a device (i.e. a shared cell phone) will need to stop doing that. Actually, they should probably stop doing that now if at all possible..
  • Minimal pain-points for consumers, relatively speaking

It will take some radical changes to upgrade the security of the web - so for him to have a plan and be willing to put it out there, I think it's pretty brave and respectable. There are also some points of concern, as you can see on twitter, the biggest one being anonymity. I definitely recommend watching the video when it's posted.

If I've said anything inaccurate about his talk, feel free to correct me - again, a lot of this was over my head :)


That's it - now on to some personal remarks about the entire weekend :)

Nodevember was absolutely a pleasure and privilege to attend. My brain is overloaded "and loving it", as Maxwell Smart would say. I got to meet some people who have influenced me from afar, and got to reconnect with others. Big shout out to some fellow Nashville folks, who you should totally follow on twitter if you aren't already: Jonathan Creamer, Jeremy McDuffie, and Elijah Manor - it was great to catch up with you guys again!

Finally, my most sincere thanks to all of the organizers, volunteers, speakers, and sponsors. Thanks for giving of your time, your energy, your selves :)

-Bradley