Do you need a Blockchain?

As some of you may know, I have been very interested in Bitcoin and blockchains for the past couple of years and have been digging into them even more lately as part of my day job. However, blockchains have been in the news so much lately that it often feels like they are a solution looking for a problem. Sure, blockchains are the sexy new tech, but here are some questions you need to ask when deciding whether a blockchain is the right solution to your problem:

  • Can you deal with transparency? Writing to a public blockchain may be a difficult new way of working (as banks are starting to find out). There is also a widespread mindset today that treats data as an asset that could be profitable and is not worth sharing (though with the increasing risk of hackers and data breaches, it's worth considering whether your big data is an asset or a liability).
  • Do you need immutable history? A huge part of the blockchain's value is that it is a record of history that can never be modified. Blockchain initiatives with super users who can rewrite history completely negate the core trust mechanism of the blockchain. However, data is sometimes reported incorrectly. Does your data consist of absolute truths, or might it need to be corrected once in a while?
  • Will you have untrusted peers on the network? One of the big reasons to adopt a blockchain is in environments of limited trust. If the only people who can write to a blockchain need to be authorized in some other way, you are just trading your current problem for a new one.
  • Can you keep your blockchain from being hijacked? If you do have untrusted peers on a network, and your network is fairly small, can a peer take over your blockchain with a 51% attack?
  • Can you just leverage the Bitcoin blockchain? Keeping a blockchain operational is work and needs nodes and miners that are incentivized to maintain it. What kind of rewards do you offer that make your chain worth working on? If you don't have the right incentives in place, your blockchain may not outlive the individual or corporate curiosity that spawned the idea. A better idea may be to write your data or, more likely, a hash of it to the Bitcoin blockchain, making it part of the official history and something that will exist even if your project dies (see the sketch after this list).
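
To make that last point concrete, here is a minimal Python sketch of the "anchor a hash, not the data" idea: compute a SHA-256 digest of a record, and only that 32-byte digest would later be embedded in a Bitcoin transaction (for example via an OP_RETURN output or a third-party anchoring service). The record layout here is made up purely for illustration.

```python
import hashlib
import json

# A made-up record; in practice this could be any document or dataset
# you want to be able to prove existed at a point in time.
record = {"id": 42, "status": "shipped", "timestamp": "2016-10-08T12:00:00Z"}

# Serialize deterministically so the same record always yields the same hash.
payload = json.dumps(record, sort_keys=True, separators=(",", ":")).encode("utf-8")

digest = hashlib.sha256(payload).hexdigest()
print(digest)  # 32 bytes (64 hex chars) -- small enough for an OP_RETURN output
```

Only the digest needs to go on-chain; anyone holding the original record can recompute the hash later and check it against the transaction.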

There is a huge tendency to fixate on the distributed nature of the blockchain, but if that is your sole goal, there are other distributed database technologies that you should be looking into.

 


Thoughts from Oculus Connect 3

For the last 3 days I have been in San Jose at the third official Oculus Connect conference (OC3), and it's been amazing to see some of the prototypes, talk to a few developers, and just learn from other folks who are charting the VR space. It's been an amazing mix of education and fun, and it's given me a lot to mull over on the flight back tomorrow. Here are some highlights:

The Social Focus: Putting people at the center

Social was definitely the big message at OC3, and the Facebook Social concept that Mark Zuckerberg demoed was really well done.

I really liked the avatars and the way they worked here. What was a little weird was that on the same stage, a few minutes later, they talked about the new Oculus Avatars system that we as developers are encouraged to use. These two projects are totally independent of each other, and a later talk by Facebook's Mike Booth covered a lot of learnings from building Facebook's avatar system that fly in the face of the look of the official Avatars app/SDK. Hopefully these two will be merged at some point in the future, but there is a real chance that they may not.

[Image: The Oculus avatars (top) have a completely different aesthetic than the Facebook Social ones (bottom)]

 

Oh, and I hope you didn't like the boxy avatars from the Oculus Social app, because that effort seems to be dead.

Oculus Touch

Oculus finally revealed the pricing and availability of the Touch controllers. At $200 they are a little pricey and bring a full Rift setup to almost exactly the same price as the Vive. That aside, the controllers are really nice, and bringing your hands into VR raises the level of immersion tremendously (at work we had used the Leap Motion sensor on top of the DK2 to get some hand tracking in a demo, but thankfully we can leave those kludges behind). The only unfortunate thing about the Touch controllers being an optional purchase is that developers can't really rely on them being available, and may avoid building around them so they don't split their market. Hopefully most Rift owners choose to get them, because I will say they work really well.


Room Scale

Oculus can now do room-scale VR, but it requires a third sensor that you can buy for $79. I can't imagine a lot of people going for this, at least not immediately, so room scale might remain the domain of the Vive for now.

Video

One of the messages Oculus wanted to send was that passive experiences shouldn't be dismissed, as usage of the Oculus is apparently split evenly between games and video apps. At the keynote, Oculus also announced a Video SDK that will let video publishers create content while letting Facebook host and distribute that content efficiently, based on their research into optimized 360 video streaming (foveated rendering, etc.). I need to dig more into this.

Misc items:

Other interesting items included:

  • Facebook continuing to fund more VR development with another $250M fund for VR apps and games
  • Oculus is adding an Education category to their store, so expect more apps and games there
  • Cheaper Oculus Ready certified PCs, including a $500 one. Oh, and Oculus Ready certified laptops for you developers on the go.
  • Oculus will cover Unreal Engine license fees for apps sold through the Oculus Store for up to the first $5 million in gross revenue.
  • Lots of effort being put into audio and ambisonic rendering. A new, higher-quality headphone for the Rift was also introduced and given away to attendees.
  • An untethered, stand-alone VR headset is in the works in the lab.
  • Oculus will now support WebVR and introduced a VR flavor of their popular React JavaScript framework.

Favorite Game:

Most of the OC3 event was about trying out the demos for different games coming soon. Lots of good ones to choose from but Eagle Flight was totally awesome.

Favorite moment: Talking to John Carmack

Okay, so this was a total nerd moment, but I have been a big fan of John Carmack for a long time and he has been kind of a geek hero of mine. Being able to talk to him for a bit was really amazing. I even captured part of it on video (vertical because I saved it from Periscope 🙂).


My Experience with UPenn’s Research Experience For Teachers program

For the last few weeks I have been involved with Penn's Research Experience for Teachers (RET) program. The 6-week, NSF-funded program paired 10 teachers from local high schools in Philadelphia with a research engineer at Penn's General Robotics, Automation, Sensing & Perception (GRASP) laboratory and an industry mentor, in an effort to expose these teachers to advanced engineering concepts that they could take back to their students.

I had the pleasure of working with Matthieu Lecce and Danelle Ross on an effort to use computer vision to detect the volume of transparent liquid in a glass container. The project built on previous work by Matthieu and the research team on seeing glassware (more information on that research can be found here). Technically Philly also did a great writeup on their site about the final day of the project.

[Image: Learning some machine vision at Matthieu's office at the GRASP lab]

The program was really intense for the teachers, and major props to Danelle for completing it and pushing through such an intense learning experience. My own contribution was fairly limited, as I spent only a couple of hours a week with the team, explaining how groups like Comcast Innovation Labs (where I work) investigate this kind of technology and how machine vision is being used in virtual reality and augmented reality, a domain I am currently very involved in at the lab.

More than anything though, it was a great learning experience for me. I learned a lot about how machine vision problems are approached (my only real prior experience with raw machine vision had been some OpenCV experiments in Processing and some half-complete Android face-detection projects; a small sketch of that kind of experiment is below). Getting a handle on the core concepts that go into machine vision and the current state of the art in the field was well worth it.
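
For anyone curious what those earlier experiments looked like, here is a minimal Python/OpenCV sketch of classic Haar-cascade face detection. This is not the code from the RET project, and the image file name is just a placeholder.

```python
import cv2

# Frontal-face Haar cascade bundled with the opencv-python package.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

image = cv2.imread("group_photo.jpg")              # placeholder file name
assert image is not None, "update the placeholder image path"
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)     # cascades run on grayscale

# scaleFactor and minNeighbors trade off speed against false positives.
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

# Draw a green box around each detection and save the result.
for (x, y, w, h) in faces:
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("faces.jpg", image)

print("Detected %d face(s)" % len(faces))
```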

Sometimes I really miss grad school.


Getting Started with Gear-VR Development


For the last 8 months, I have been working on VR prototypes for Comcast, one of which we showed off recently at the Cable Show and have been talking about at different conferences (you can see the deck here).

We recently started adding some new faces to my team, so I figured it might be a good idea to put down a quick how-to on building GearVR apps. A lot of this can be found at different links on the internet, but it might be useful for going from zero to a quick “Hello World” app using Unity.

1. Setting Up GearVR for Android build:

Since VR apps for the GearVR are Android apps, the standard Android development setup is required:

  1. Go to the Settings app on your Samsung phone and open the About Phone section.
  2. Tap the “Build Number” item in the list repeatedly till you see a “You are now a developer” message.
  3. Back in the main Settings list, you’ll now see a Developer options menu item.
  4. Tap on it and, in the list that comes up, turn on the USB debugging toggle.
  5. Connect your phone to your laptop and an alert will appear asking if you’d like to allow the PC to debug your app. Tap yes.


2. Create a Unity app

  1. Install the latest version of the Unity IDE including the Android plugin.
  2. Create a quick new project (just put a cube in front of the camera so that you have something to see when the app launches)
  3. From the Unity IDE toolbar, click on File -> Build Settings
  4. Unity probably defaulted you to a Mac/Windows app export. From the platform list, click on Android and then click on Switch Platform. This might take a couple of minutes as Unity converts the project.
  5. When done, tap on the Player Settings. In the settings that appear, tap on the Android tab, and in the Other Settings, select the Virtual Reality Supported option and enable the Oculus option.

3. Enable your Android device to run Oculus apps.

  1. Create a profile at developer.oculus.com.
  2. Connect your phone to your computer and use the adb devices command on the command line to detect the connected device. Copy the device id (a small helper sketch follows this list).
  3. Use the device id at the Oculus osig file generator. Paste the id in the form field and you’ll get an osig file.
  4. Save the osig file in your project in the Project > Plugins > Assets folder. This will allow Unity to package the file into the generated APK.
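
If you end up doing step 2 often (new phones, new team members), here is a small Python convenience sketch that shells out to adb and prints the ids of connected devices. It assumes adb is already on your PATH; it is just a wrapper around the same command, not part of the official Oculus workflow.

```python
import subprocess

# `adb devices` prints a header line followed by one "<serial>\t<state>" line
# per device; keep only entries whose state is "device" (i.e. authorized).
output = subprocess.check_output(["adb", "devices"]).decode("utf-8")

for line in output.strip().splitlines()[1:]:
    parts = line.split()
    if len(parts) >= 2 and parts[-1] == "device":
        print(parts[0])  # this is the id to paste into the osig generator
```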

4. Build and Run 

  1. Click the Build and Run item in the Unity toolbar. Unity will pack and deploy the app to your phone.
  2. You’ll probably get an alert saying “The app cannot be launched because there is no Launcher Activity in the app,” but the app will still be deployed to the device.
  3. Back on the phone, go to Settings > Application Manager and find an app called “Gear VR Service”. Select it.
  4. Tap Storage and then Manage Storage.
  5. Tap the VR Service Version label multiple times till you get a notification saying “You are now a developer”. Two other options will appear below: Developer Mode and Add Icon to app list.
  6. Tap on Add Icon to app list. This will add an icon to the phone’s app list (where you find all your other apps).
  7. Tap the icon from the apps list. The launching activity will list all your available VR apps. Find your app and tap on it. You’ll get a screen instructing you to insert the phone into the GearVR. Inserting it will launch your app 🙌

 


Bitcoin Pow Wow 2 notes

Every couple of months I meet a few friends over lunch to geek out over the latest in the world of Bitcoin, blockchains, and cryptocurrencies in general. Just so that I don’t forget them, here is a list of the things we discussed today 🙂

 


Bringing Cable TV to VR (Slides)


Yesterday Jack Zankowski (who leads next-gen UX at Comcast) and I gave a talk on the design and engineering challenges of building VR experiences for TV content at the WICT Tech It Out event at Villanova University. While there, we were also able to check out their pretty interesting VR cave.

The talk is based on a VR prototype we demoed recently at the Cable Show and the Code Conference. Personally, it’s been a very educational experience. In a way, working with Unity is like working with Flash all over again, with similar challenges (managing visual assets, code architecture, working in a team with varying skill sets, from design to development). Hopefully I’ll do some more write-ups here on those challenges. But for now, the deck from the event is embedded below.

Thanks to the folks at WICT for having us.

 

 


DiggGraphr referenced in technical paper on news services using big data analytics

Quite a few years back, I got really interested in treemaps. The whole project started as an academic discussion between a friend and me on how hard a treemap would be to build (they seemed to be a pretty popular data visualization method back then, though I don’t see them around much these days). Anyway, what I thought would be a trivial weekend hack turned out to be a lot more involved, and I ended up reading and implementing Mark Bruls, Kees Huizing, and Jarke J. van Wijk’s squarified treemap algorithm (in ActionScript 3; you can find all the code, which I open sourced, here). A rough sketch of the layout idea is below.
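
The ActionScript 3 code lives in the linked repository; what follows is just a rough, from-memory Python sketch of the squarified layout idea (greedily grow a row along the shorter side of the remaining box while the worst aspect ratio keeps improving), not the original implementation.

```python
# Minimal sketch of the Bruls/Huizing/van Wijk squarified treemap layout.

def worst_ratio(row, length):
    """Worst (largest) aspect ratio if `row` is laid out along a side of `length`."""
    total = sum(row)
    side = total / length              # thickness of the row
    ratios = []
    for v in row:
        other = v / side               # extent of this rectangle along `length`
        ratios.append(max(side / other, other / side))
    return max(ratios)

def squarify(values, x, y, w, h):
    """Return (value, x, y, w, h) rectangles filling the given box.
    `values` should be sorted in descending order and sum to w * h."""
    rects = []
    values = list(values)
    while values:
        length = min(w, h)             # lay the next row along the shorter side
        row = [values.pop(0)]
        # Greedily add values while the worst aspect ratio does not get worse.
        while values and worst_ratio(row + [values[0]], length) <= worst_ratio(row, length):
            row.append(values.pop(0))
        thickness = sum(row) / length
        # Emit the rectangles of this row, then shrink the remaining box.
        offset = 0.0
        for v in row:
            extent = v / thickness
            if w >= h:                 # row runs vertically along the left edge
                rects.append((v, x, y + offset, thickness, extent))
            else:                      # row runs horizontally along the top edge
                rects.append((v, x + offset, y, extent, thickness))
            offset += extent
        if w >= h:
            x, w = x + thickness, w - thickness
        else:
            y, h = y + thickness, h - thickness
    return rects

# Example: the paper's classic values scaled to fill a 600x400 canvas.
data = sorted([6, 6, 4, 3, 2, 2, 1], reverse=True)
scale = (600 * 400) / sum(data)
for rect in squarify([v * scale for v in data], 0, 0, 600, 400):
    print(rect)
```

Running it prints one (value, x, y, w, h) tuple per rectangle, which is all a renderer needs to draw the boxes.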

To demo the algorithm, I took the visual aesthetics of Marumushi’s NewsMap but instead used it to show trending topics on Digg.com. The project, not so creatively named DiggGraphr, got fairly popular and was mentioned on a few data visualization blogs and even won an award for a Digg.com API contest.

DiggGraphr has been dead for a while now, but I was pleasantly surprised to see it included in a research paper titled ‘A case study on news services using big data analytics (뉴스빅데이터 서비스 사례 및 모델 개발 연구)’, conducted by Professor Kim of Korea Aerospace University and three researchers from a non-profit research organization (the Media and Future Institute).


If any of you can read Korean, feel free to read the paper here.

 
