HashHype summarizes internet noise

HashHype is a web app that visualizes trending topics across stocks, social media, and news. It uses real-time natural language processing to select the single comment that best represents each trend, so users can quickly cut through the chatter and be the first to react. HashHype.com was inspired by the meme stock revolution, and the fact that reality is what we make it, for better or worse.


Is this just hashtags?

The HashHype algorithm goes way beyond counting stock #tickers and @mentions. It analyzes the entire comment, capturing context and sentiment from every word in each sentence. It understands word combos of different lengths, and then clusters of those combos, respecting the order and distance between combos in a cluster. HashHype also understands symbols like emojis, URLs, stock tickers, and more. Each aspect of the algorithm is illustrated in the UI so that users can understand the information in a clear and transparent manner.
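To make the word-combo idea concrete, here's a minimal sketch of the general technique (illustrative only, not HashHype's actual code): extract variable-length combos from each comment, then tally them across the stream to surface candidate trends.

```js
// Illustrative sketch only - not the actual HashHype algorithm.
// Extract "word combos" (n-grams) of varying lengths from a comment.
function extractCombos(comment, maxLen = 3) {
  const words = comment
    .toLowerCase()
    .replace(/[^\w\s$#@]/g, ' ')   // keep $TICKER, #tags, @mentions
    .split(/\s+/)
    .filter(Boolean);

  const combos = [];
  for (let len = 1; len <= maxLen; len++) {
    for (let i = 0; i + len <= words.length; i++) {
      combos.push(words.slice(i, i + len).join(' '));
    }
  }
  return combos;
}

// Tally combos across many comments to find the loudest candidate trends.
function tallyCombos(comments) {
  const counts = new Map();
  for (const comment of comments) {
    for (const combo of extractCombos(comment)) {
      counts.set(combo, (counts.get(combo) || 0) + 1);
    }
  }
  return [...counts.entries()].sort((a, b) => b[1] - a[1]);
}
```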

While HashHype calculates the top ten trends at any given moment, it also snapshots each of those moments into a historical database. That historical data makes the algorithm more accurate over time, because trends with high comment volume are not necessarily the most urgent trends. For example, while many people continue to discuss GameStop, it's no longer a fresh trend that users can benefit from. HashHype applies a “decay” property that lets old trends fade away over time, so new, fast-moving trends display front and center.
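The decay idea can be sketched roughly like this (the half-life and formula are hypothetical, not HashHype's actual numbers):

```js
// Illustrative sketch of the "decay" idea - not HashHype's actual formula.
// A trend's raw volume is discounted exponentially by its age, so stale
// trends fade even if people keep talking about them.
const HALF_LIFE_HOURS = 6; // hypothetical half-life

function decayedScore(commentCount, ageHours) {
  return commentCount * Math.pow(0.5, ageHours / HALF_LIFE_HOURS);
}

// A year-old mega-thread loses to a small burst from the last hour:
decayedScore(50000, 24 * 365); // effectively 0
decayedScore(300, 1);          // ~267
```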


Tech stack details

HashHype recycles many of the lessons learned during my work on HashBack, but simplifies and minifies along the way. At the time of writing, HashHype is actively parsing comments from Reddit's r/wallstreetbets and Twitter on a single Raspberry Pi. Many hours were spent analyzing memory/CPU usage, then iterating on code optimizations, persistence, and caching techniques. Why? Let’s just say HashHype’s roadmap may involve hotspots to mine peer-to-peer information exchange.

In this project I got to wear the full-stack hat, doing a good deal of custom coding for front end, back end, ops, and algo work. I even host the Pi cluster in my own home with all kinds of redundancies and fail-safes (this included climbing onto our roof to install a 5G antenna - fun stuff). If anyone reading this wants to know more, buy me a beer and brace for max nerd mode.

Why did I pick this project?

We are witnessing a social evolution that is moving at an unexpected pace. Our modern methods of exchanging information amount to far more than convenience or fun; they shape our existence and predict the future. I am particularly motivated to complete development of the social-trends features before our next presidential election. I hope that my work will offer information that isn’t bound by ulterior motives, and simply helps us be ourselves.

Adobe.com Search App

During my 5+ years at Adobe, the search page was perhaps my favorite project. I took the lead on architecture and engineering and collaborated with teams across the entire company. Some of Adobe’s flagship products with in-app search would send users to us, and sister sites like Behance, HelpX, and others also hooked their search fields into our app. On a busy day we would serve half a million users from all over the world, in all languages. With this kind of traffic and visibility, the app needed to be rock solid.

Top priorities were not only stability, performance, and accessibility; the UI also needed to adapt to the user. For example, if the user typed their search in the Photoshop app, the search page would show a set of results oriented toward creative users, and the entire layout of the page could even change based on context. The goal was to build a single web app that unified the search experience for all Adobe products and users.

Building a transformer app

Since the search app needed to adapt to the user’s context, there were too many cases for engineering to support directly. The app therefore needed to be authorable in AEM (Squarespace for the enterprise). Product, design, or marketing reps could log in to author their own version of the app, and their customizations would be saved to a “context object” that became part of the search page URL. This made each experience fully portable and deep-linkable.
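The general technique looked roughly like this (field names and encoding are hypothetical, not the actual Adobe implementation): the authored customizations live in a plain object that rides along in the URL.

```js
// Hypothetical sketch of a "context object" carried in the URL.
const context = {
  audience: 'creative',        // e.g. the search came from an in-app entry point
  layout: 'grid',
  facets: ['tutorials', 'stock'],
};

function buildSearchUrl(query, context) {
  const params = new URLSearchParams({
    q: query,
    ctx: btoa(JSON.stringify(context)), // encode the context into the query string
  });
  return `/search?${params.toString()}`;
}

function readContext(url) {
  const ctx = new URL(url, 'https://example.com').searchParams.get('ctx');
  return ctx ? JSON.parse(atob(ctx)) : {};
}
```

Because the whole experience is described by that one object, any authored variation can be shared, bookmarked, or deep-linked simply by copying the URL.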

Of course, offering this level of customization didn’t come without challenges. I repeatedly found myself in the classic engineering predicament of “Do we manually code this requirement, or take some extra time to build the tooling to support all future requirements like it?” As usual, there’s a happy medium to be found, but a misstep can lead to an unwieldy codebase and maintenance nightmares. This is particularly true in a large corporation where changing priorities and conflicting requirements come from all directions. In the end, we found that happy medium, but not without constant attention to clear documentation and communication across all teams and stakeholders.

“I wanna go fast.” -Ricky Bobby

After wading through all the business requirements, it was time for my engineering nerdiness to stretch its legs. This app needed to load instantly, and every click, hover, and keystroke needed to have immediate purpose. This was no small task considering the search API team was still in active development when I started to integrate.

By the time the search app went live, we were actually sending five separate search requests just for the page landing (because the API team hadn’t yet added support for batched category searches). As they continued to refine the Elasticsearch indexing and query logic, I found ways to compensate. All search requests were concurrent and rendered progressively, and responses were cached in session storage (which was particularly helpful when users bounced between query/filter/sorting/page combinations). There was always meaningful content rendered, and it was nearly impossible for the user to tell that requests were often loading in the background.
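The pattern, roughly (endpoint and category names are hypothetical, not the actual Adobe API):

```js
// Fire all category searches concurrently, render each as it lands, and
// cache responses in sessionStorage so repeat query/filter/sort/page
// combinations come back instantly.
const CATEGORIES = ['products', 'tutorials', 'help', 'stock', 'fonts'];

async function searchCategory(query, category, render) {
  const cacheKey = `search:${category}:${query}`;
  const cached = sessionStorage.getItem(cacheKey);
  if (cached) return render(category, JSON.parse(cached));

  const res = await fetch(`/api/search?cat=${category}&q=${encodeURIComponent(query)}`);
  const results = await res.json();
  sessionStorage.setItem(cacheKey, JSON.stringify(results));
  render(category, results); // progressive render - no waiting on sibling requests
}

function searchAll(query, render) {
  return Promise.all(CATEGORIES.map(cat => searchCategory(query, cat, render)));
}
```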

I was also able to implement a ‘lite’ version of the FE asset package that most Adobe.com domains used (one luxury of being project architect). After some fat was trimmed (or moved to the bottom of the page) and webpack magic was in place (tree shaking), I had a tidy little bundle with negligible load times. When designers would ask about a loading spinner, I’d happily respond “we don’t need one.” For some time, the search page ranked #1 in load times out of all Adobe.com pages, even outperforming pages with static content.
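For a rough idea of the kind of build tuning involved, here's a minimal webpack sketch (illustrative settings and file names, not the actual Adobe config):

```js
// webpack.config.js - minimal sketch of the trimming described above.
module.exports = {
  mode: 'production',               // enables minification and tree shaking
  entry: './src/search-app.js',     // hypothetical entry point
  output: { filename: 'search.[contenthash].js' },
  optimization: {
    usedExports: true,              // mark unused exports for removal
    sideEffects: true,              // respect package.json "sideEffects" hints
    splitChunks: { chunks: 'all' }, // pull shared vendor code out of the page bundle
  },
};
```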

On to a new chapter

I made the decision to leave Adobe this winter. It’s nice to reflect on the wins as I consider the many different projects I worked on over five years. Some projects went better than others, and there were some tough lessons learned, as with any job. For me, perhaps the greatest challenge was learning to crawl when I was trained to sprint (a side effect of moving from the startup to the corporate world). As a creative coder, and somewhat of a perfectionist, it took me a while to adjust to a system of guardrails and speed bumps. However, I’d like to think I’ve emerged as a much more well-rounded engineer, and I’m thankful for those challenges.

When I started, the front-end team was just a fraction of its current size, and its web tech has come a long way since then. I was honored to play an integral role in that evolution, and am proud to have mentored some new hires along the way. Big thanks to Adobe, and to the folks I worked with. See you out there!

SlyBars, an SPA framework for Adobe Experience Manager

SlyBars provides the tools to create single-page apps in AEM. It bundles all of your components into a custom MV* architecture with built-in lifecycle hooks, event/memory management, and browser support for dynamic HTL (Sightly) templates. Any component you create with SlyBars can be rendered both client- and server-side.

Built with simplicity in mind, this trimmed-down MV* framework includes only what you need. HTL templates can be dynamically rendered client-side against data at any JCR endpoint, combining advanced web-app functionality with the power of AEM server-side rendering, caching, localization, and SEO optimization.
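Conceptually, the client-side half looks something like the sketch below (the function names are hypothetical, not SlyBars' actual API); AEM's Sling layer already serves JCR content as JSON, and the framework handles the template side.

```js
// Hypothetical illustration of the pattern - not SlyBars' actual API.
// Fetch JSON for a JCR content path (Sling exposes .infinity.json by default),
// then re-render a client-side template against that data.
async function renderFromJcr(contentPath, template, mountEl) {
  const res = await fetch(`${contentPath}.infinity.json`);
  const data = await res.json();
  mountEl.innerHTML = template(data); // template() is a precompiled client-side template
}

// Usage: re-render a component whenever its backing content changes.
// renderFromJcr('/content/mysite/en/home/jcr:content/hero', heroTemplate, document.querySelector('#hero'));
```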

Check out the video below for a technical overview of the challenge that SlyBars attempts to solve.

HashBack - JS App Framework for Distributed Computing

HashBack emulates a multithreaded Node.js environment that can scale to be as large or as small as your app requires. It also addresses the well-known memory constraints of Node. HashBack can serve many purposes, from simple single-page web apps to machine learning applications. HashBack is open source, but still in early-stage development.

HashBack uses ES6 Proxies to maintain a memory-efficient virtual tree of all your app's modules and how they relate. It behaves like an in-memory graph representation of a NoSQL database. The tree automatically distributes itself by sharding into multiple Node processes, which run in Docker containers on separate AWS EC2 instances. All EC2 instances are linked to each other via Redis pub/sub, which reunites your sharded graph so that you can interact with it as if it were whole.
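As a generic illustration of the Proxy technique (not HashBack's actual API), a handler can make remote shards look like ordinary local property access:

```js
// Generic illustration only - not HashBack's actual code.
// A Proxy intercepts property access on the module tree, so nodes that live
// in another shard can be resolved lazily instead of held in local memory.
function makeNode(name, resolveRemote) {
  const local = new Map(); // children that live in this process

  return new Proxy({}, {
    get(_target, key) {
      if (local.has(key)) return local.get(key);
      // Child lives on another shard - look it up remotely (stubbed here).
      return resolveRemote(`${name}/${String(key)}`);
    },
    set(_target, key, value) {
      local.set(key, value);
      return true;
    },
  });
}

// Usage: reads look local even when the data lives elsewhere.
const root = makeNode('root', path => ({ path, remote: true }));
root.feeds = { reddit: 'r/wallstreetbets' };
console.log(root.feeds.reddit); // local hit
console.log(root.models);       // resolved via the stubbed remote lookup
```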

HashBack also has a built-in web server that lets you serve React views from any node on your tree. They might represent a whole page, or part of one, and they automatically stay in sync in real time through socket.io. Routing is resolved automatically from the node's tree location, so you don't have to set that up separately. HashBack is an isomorphic framework that offers high performance with minimal configuration. You just do the fun parts.
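The real-time sync portion builds on standard socket.io rooms; a generic sketch (illustrative event names, not HashBack's actual ones) might look like this:

```js
// Generic sketch of real-time data syncing with socket.io.
const http = require('http');
const { Server } = require('socket.io');

const server = http.createServer();
const io = new Server(server);

io.on('connection', socket => {
  // A view subscribes to the tree path it was rendered from.
  socket.on('node:subscribe', path => socket.join(path));
});

// When a node's data changes, broadcast it to every view that mounted it.
function publishNodeUpdate(path, data) {
  io.to(path).emit('node:update', { path, data });
}

server.listen(3000);
```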

Boundless.com Teaching Platform

Boundless is an ed-tech company that offers a collaboration platform for teachers. Educators can author and edit entire books, or create chapters, quizzes, and assignments. When educators propose changes to existing content, the changes pass through a git-like versioning system, and they can review, revert, and jump through version histories as they collaborate. After edits are complete, they can even use Boundless to assign the content to their students, then track progress and grades.

Position: Front-end Engineer
Tech: Ruby on Rails, Backbone, Sass, and real-time socket-based interaction

Mean.to Exchange

The Mean.to Exchange was a platform for social commerce. The concept was to create a virtual stock market for photos and video. Users could collect their favorite content and earn money if it became more popular. Users could build a collage-like portfolio of everything they uploaded and collected, then track its growth. The portfolio editor was built from scratch, and offered responsive drag-and-drop functionality and endlessly customizable layouts. Check out the video above for a short demo.

Position: Founder, Full-stack engineer, Designer
Tech: LAMP (WordPress), Node.js for photo and video uploading/encoding (FFmpeg), real-time socket-based content tracking, CSS3 animations (when they were all the rage)

Sprint's "Store of the Future"


Sprint hired SapientNitro to build an in-store shopping experience that could run on any of their devices. I was a contractor at the time and part of the team that built the app in Appcelerator, a framework for cross-platform development. In-store devices ranged from Android and iOS smart devices to a 60-inch touchscreen TV.

Position: Interactive Developer
Tech: Appcelerator, Backbone, Extensive on-site testing with all target devices

Freelance Clients

Through my own company (Meanwhile Media), I enjoy taking on the occasional freelance contract. Feel free to contact me should you be interested in working together.