
Remember Lenny

Writing online


Uncategorized

How I listened to music growing up

February 23, 2017 by rememberlenny


I didn’t listen to music in elementary school. The only other person my age in my neighborhood had an older sister who gave him CDs, and that’s how Weird Al Yankovic came into my life. I slowly started listening to music on the radio, and eventually became fascinated with the vinyl records sold at secondhand stores. I lived in a rural area, so the number of radio stations was limited.

It wasn’t until middle school that I started to identify with the music I liked. Kazaa and similar file-sharing services became popular. My exposure was limited to the genres I knew about, so everything new was a discovery. I felt like an adventurer setting out into the world. Techno turned into drum and bass, turned into house music, turned into DJ mixes, turned into electronic trip-hop. Everything was one file at a time.


I burned CDs and listened to them on my portable CD player. First I burned traditional WAV-file CDs; eventually I got an MP3-reading CD player. This let me fit more songs and create better mixes, but limited where I could listen to them. The worst thing about CDs was never being sure whether a burn had actually worked, and wasting tons of blanks in the process. Some blank CDs claimed to be rewritable, but they never worked. This was also when I learned how to buy CDs in bulk, use online wholesalers, and gauge quality at retailers. (This was also when I was obsessed with energy drinks.)


Then came the discography days. Peer-to-peer downloading sites went from hosting 3–5 megabyte files to 6–7 gigabyte zip directories. I went from downloading individual songs to getting entire band discographies. I discovered Rage Against the Machine, Nirvana, and the Wu-Tang Clan. I didn’t have the context to know which albums were the best or what I should listen to first, so I had hit-or-miss experiences, sometimes missing a band’s well-known songs and falling in love with others that no one else knew.


I owned a MiniDisc player, which was all the rage in Japan, and downloaded everything cool I could find. MiniDiscs were, in effect, small rewritable CDs for my MP3s. Being able to rewrite their contents made them an incomparably better choice than CDs. The best part at the time was being able to fill a MiniDisc and share it with friends who had players of their own. The worst part was not being able to take the files off the disks. The software for managing MiniDiscs was all pretty bad Sony tooling.

Early in high school, I remember finding download packs of the top 100 Billboard songs. These packages were a new way to discover bands I didn’t know. I would get 100 songs from a genre I was interested in, then look for the discographies of the bands I liked. If I couldn’t find a discography, I often wouldn’t listen to the band at all. This odd behavior was the culmination of my earlier habits, and it meant missing great bands.


I got my first MP3 player. It was a cheap plastic USB device that could run for 4–6 hours. The entire thing was no bigger than a pocket flashlight. The only ports were a built-in USB plug and a 3.5 mm headphone jack. The best part about these MP3 players was being able to put music on and then take it off somewhere else. You didn’t need software to use the device, because it acted like a USB disk. The design was tacky, but in high school I threaded all my cables through my jacket and under my shirt, so no one ever saw what it looked like.

Midway through high school, I started discovering smaller bands. My friends would find something through a friend of a friend and we would obsess over a small subset of no-name bands. I defined my identity around the odd bands I discovered and found instant rapport with people who shared the same interests.

My first flip phone had a 3.5 mm jack that let me listen to music directly off of it. The music was loaded onto a mini-SD card from my computer. I could fit an entire gigabyte of songs on the phone. I could rip a CD onto my computer, convert the WAV files into MP3s, and pack eight or nine albums onto the SD card.


Nearing the end of my sophomore year, I fell in love with hyphy music and gangster rap: Mac Dre, Andre Nickatina, Bay Area mixtapes, the Demolition Men. Reggae, ska, old-school hip-hop. I went from listening to music on headphones to blasting it from my car, and went back to burning traditional CDs so I could play them on my car stereo.


I got my first iPod when I was a junior. I went from burning CDs to using the iPod with a magical FM transmitter. The FM transmitter plugged into the iPod and streamed music over a designated FM radio channel. It was perfect on a country road, but became problematic in crowded urban areas: the more popular FM transmitters became, the more likely you were to pick up someone else’s iPod signal. The harder it was to depend on the transmitter, the more likely I was to fall back on an old burned CD sitting in my car, or just pick a radio station.

I continued fluctuating between mediums for listening to music as I got older.


Now, the way I take in music is more unconscious than ever. Rather than using streaming services like Spotify or iTunes, I depend largely on blog aggregators like Hype Machine. I trust the social signals more, and use them as a filter to determine what I do and don’t listen to. I’m more disconnected than ever from the source of my music. I only dig into a single artist when a song really stands out and I’m drawn to find more.

There is a lesson to be taken from the changing ways I’ve accessed content.

Filed Under: Uncategorized Tagged With: Music, Music Discovery, Technology

January 14, 2017 by rememberlenny

Scoring in-person audience engagement through video footage analysis

Imagining how to provide speakers and performers with insightful feedback

Originally posted on Medium here.

Speaking to a large audience

Speaking to a large group is stressful. You can leave thinking things went horribly when you couldn’t be more wrong. If you’ve spoken in front of a large group before, then you know what it feels like to not know what your audience thinks. While delivering your message, you can become so focused on one person that you misinterpret the group as a whole. The one person texting or falling asleep can skew how you read the audience.

You could completely misunderstand the audience.

Imagine what it would look like to be able to record and reflect on your audience, based on what you say.

Quantifying the audience

First: Take a video camera or cell phone and plop it facing your audience. Have the camera film your audience during your segment of speaking or performing.

Second: Take the recorded footage and drop it into some non-existent but feasible software that analyzes each audience member’s face and body over time. The software could assign each person a score for their emotional state and level of engagement. From these scores, timelines could be generated for each person and for the audience as a whole.

Third: Summaries of the scores could be turned into recommendations or insights for the speaker or performer to consider. From these recommendations, a feedback-based improvement cycle could be developed for speakers and performers, and a quantifiable score could be attributed to each audience.

All of the building blocks needed to implement this are readily available.
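To make the aggregation step concrete, here is a minimal sketch in Python. It assumes a hypothetical face-analysis stage has already emitted a per-frame engagement score for each detected audience member; the person IDs, scores, and 0–1 score range are invented for illustration.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical input: per-frame output of a face-analysis stage.
# Each entry is (frame_number, person_id, engagement_score in [0, 1]).
frames = [
    (0, "p1", 0.9), (0, "p2", 0.4),
    (1, "p1", 0.8), (1, "p2", 0.3),
    (2, "p1", 0.7),  # p2 disappeared (left or looked away) after frame 1
]

def build_timelines(frames):
    """Group scores into per-person timelines: person_id -> [(frame, score)]."""
    timelines = defaultdict(list)
    for frame, person, score in frames:
        timelines[person].append((frame, score))
    return dict(timelines)

def audience_summary(timelines, total_frames):
    """Summarize each person: average engagement and fraction of time present."""
    return {
        person: {
            "avg_engagement": round(mean(s for _, s in tl), 2),
            "retention": len(tl) / total_frames,
        }
        for person, tl in timelines.items()
    }

timelines = build_timelines(frames)
summary = audience_summary(timelines, total_frames=3)
print(summary)
```

The same shape of summary, computed over a whole talk, is what the "Review" comparison below would be built on: per-person retention answers "who stayed," and the timeline answers "when were they engaged."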

Review

Imagine knowing how many audience members stayed for the entire time you spoke. Imagine knowing when people were most engaged or most disengaged. Reviewing footage is one thing, but being able to compare your results to previous engagements would be priceless.

Who would care

If this existed as a service, large venues could offer the insights to performers. Comedians could use the results to refine their craft. Corporate speakers and salespeople who travel on behalf of their company could report quantifiable results from their group sessions.

Thoughts, concerns, or comments?

This is an idea. I’d love to know what you think.

Originally posted on Medium here.


Filed Under: Uncategorized Tagged With: Communication, Computer Vision, Enterprise Technology, Machine Learning, Public Speaking



Idea: Slack/Github activity monitoring

November 2, 2016 by rememberlenny


Working for a distributed company has made me aware of an issue: employees can “disappear” too easily and no one notices. When employees have issues, there is no predetermined face time to nip them in the bud. HR visits, employee gatherings, and good ol’ one-on-ones are easier to avoid. Distributed teams naturally focus attention where work is getting done and miss areas where employee satisfaction is not ideal.

Poor employee engagement could be pegged to a metric based on the interactions a person has per week.

I think you can solve for this without being creepy. It’s plausible that a basic observation of how users interact with one another in a public venue is a strong signal of employee engagement. Take a Slack-based company as an example: if a user is never active in any Slack channels, they are most likely not participating in group discussions. That could be the result of many things, but one possibility is an individual’s sense of no longer feeling engaged by their coworker community.

Employee engagement viewed over time.
It is important to note that a person’s engagement can be interpreted many ways. A highly participatory user could feel very disengaged. A non-participatory user could be diligently keeping up with channel activity, and therefore feel engaged. Public participation alone is not foolproof.
Public participation could be a strong metric when comparing a user against themselves over time. For example, when a user starts at a company, their activity may be above normal due to the high number of introductory engagements. As time goes on, the employee may settle into a participatory trend that more closely resembles their overall behavior. Against that corrected baseline, any significant veering away would be an event worth noting.
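A minimal sketch of that baseline comparison, assuming weekly interaction counts have already been pulled from a chat platform’s API; the counts, window size, and threshold below are all invented for illustration.

```python
from statistics import mean, stdev

def flag_deviations(weekly_counts, window=4, threshold=2.0):
    """Flag weeks whose activity deviates from the person's own trailing
    baseline by more than `threshold` sample standard deviations."""
    flagged = []
    for i in range(window, len(weekly_counts)):
        baseline = weekly_counts[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma == 0:
            continue  # perfectly flat baseline; avoid divide-by-zero
        if abs(weekly_counts[i] - mu) / sigma > threshold:
            flagged.append(i)
    return flagged

# Invented example: busy onboarding weeks, a steady middle, then a sharp drop.
counts = [50, 48, 30, 28, 29, 31, 30, 5]
print(flag_deviations(counts))  # only the final drop-off week is flagged
```

Note that the high early weeks are not flagged: each week is compared only against that person’s own recent baseline, which is exactly the “corrected average” idea above.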

Commit history mapped against a user’s commit trend could surface potential employee-project connections.
The same analysis could be run on an engineering-centric organization with a shared code repository. Whether on GitHub or a locally hosted Mercurial equivalent, individual contribution trends could be interpreted over time for employee-project fit. For example, potential fit could be gauged from an employee’s rate of contribution during certain projects.
This can be creepy from a management and HR standpoint.

User activity could signify user happiness.
To make this work, the observed results should never be used for consequences. Punishment or reward based on the analysis would be counterproductive. Employees who believe their overall behavior is being monitored and tracked could feel untrusted; this form of analysis could itself cause employees to feel dissatisfied.
The overall goal is to catch the otherwise uncaught signals, not to monitor your employees. If something major changes, some structure should be in place to ensure that a struggling employee doesn’t simply fade away because they are no longer in the limelight.

Filed Under: Uncategorized Tagged With: Distributed Teams, Employee Engagement, Work Culture

Computer Vision in commercial establishments

October 24, 2016 by rememberlenny

Security camera to scene analysis and summary insight report.

Since the inception of the “quantified self” and its corresponding hardware devices, I have been curious about what people do with the data they collect about themselves. Fitbits, Apple Watches, and step-tracking tools give individuals insight into their physical activity. With the increase in data generated about individuals, I tend to wonder how this information translates to groups.

Hardware companies have written interesting studies analyzing the data their platforms and devices collect, but this information is restricted to the company’s chosen points of interest. Smaller establishments, such as restaurants or commercial stores, do not have the benefit of gaining insight from the increase in these data points.

I think there is an interesting way to think about the other sensors newly available for analysis: specifically, the sensors that have always been around but have never been fully utilized.

Tracking trends in public spaces is now available to all

In years prior, sonar, lasers, and similar single-purpose sensors were required to turn the physical world into data. Tracking traffic trends in public spaces or foot traffic in commercial establishments previously required specialized hardware, which was available to cities and large commercial ventures, but again, not to smaller-scale ones.

Security cameras in restaurants, cafes, and bars could be used to reconstruct scenes.

Software advancements now turn everyday digital cameras into complex sensors. Using image-processing software, the content of a frame can be quickly analyzed for recognizable objects and human behavior. Computer-vision software can also reconstruct scenes from networks of cameras. This allows common security-camera infrastructure to be used for something like Google Analytics for physical spaces.

Data gained from analyzing customers could be used to gauge customer loyalty.

Human activity in a common space, monitored with a digital camera, can be turned into analyzable data points. Further, in commercial establishments, transaction data can extend these reference points to show how a brick-and-mortar business operates. I assume this information would not provide revolutionary insights for a business owner, but it would be a great reference for changes in business activity. It’s reasonable to assume this data would, over time, help foresee trends in business growth or decline.
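As a rough sketch of what “foreseeing trends” could mean in practice, assume daily visitor counts have already been extracted from camera footage (the numbers below are invented). A least-squares slope over the series indicates growth or decline:

```python
def trend_slope(counts):
    """Least-squares slope of counts over time (visitors per day).
    Positive suggests growth; negative suggests decline."""
    n = len(counts)
    xs = range(n)
    x_mean = (n - 1) / 2
    y_mean = sum(counts) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, counts))
    den = sum((x - x_mean) ** 2 for x in xs)
    return num / den

# Invented daily visitor counts over two weeks, slowly declining.
daily_visitors = [120, 118, 121, 115, 110, 112, 108, 105, 107, 101, 99, 100, 96, 94]
slope = trend_slope(daily_visitors)
print(f"{slope:.2f} visitors/day")  # negative: foot traffic is declining
```

A single slope is a blunt instrument, but run weekly against a rolling window it would give a small business the kind of directional signal the paragraph above describes.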

In the immediate term, this kind of monitoring and analysis is uncomfortable. I’m curious to see if our expectations and cultural norms will change with time. Already, our activity online is analyzed and used as a commercial product. The click-trails we leave behind are aggregated and used to better target ads. The search behavior, communication patterns, and social graphs we create offer greater insight into the kind of person we are than most of us would like to admit. This monitoring of behavior is something people have become both more sensitive to and less aware of.

Given the changing times, I wonder if brick-and-mortar businesses would be comfortable using their existing infrastructure to gain deeper insights into their business. Further, I wonder how customers would feel knowing that their activity is being monitored and analyzed for the benefit of a business.

This technology is already widely used in public places. Companies like Sidewalk Labs offer free wifi through publicly placed kiosks outfitted with high-quality cameras, which are undoubtedly being used to monitor public activity. I’m curious whether this kind of monitoring and analysis would have required a different approach with the older hardware solutions of the past.

Also posted on Medium

Filed Under: Uncategorized Tagged With: Big Data, Computer Science, Computer Vision, Hardware, Image Recognition
