Today I deployed a second site to a Firebase project. I had deployed sites individually on different Firebase projects but hadn’t realized that a single project could support multiple sites. This is especially useful if the various sites share the same assets (think internationalized versions of the same site, etc.).
The documentation on multi-site support is actually pretty good. In my case, my “launchpage” project was completely different from the other site on the project but it does give me the opportunity to bring the two together later. It basically came down to modifying the firebase.json file to look like this:
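For reference, a multi-site firebase.json ends up looking something like this (the target names and directories here are placeholders, not my actual setup; each `target` gets mapped to a site with `firebase target:apply hosting <target> <site-name>`):

```json
{
  "hosting": [
    {
      "target": "launchpage",
      "public": "public",
      "ignore": ["firebase.json", "**/.*", "**/node_modules/**"]
    },
    {
      "target": "mainsite",
      "public": "dist",
      "ignore": ["firebase.json", "**/.*", "**/node_modules/**"]
    }
  ]
}
```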
This file tells the Firebase tools to ignore certain files and deploy others to hosting.
You can test your app locally by running `firebase serve` and deploy to production with `firebase deploy`, both from the project root.
The only hiccup I ran into was setting up DNS correctly. While Firebase tries to make it easy by giving you an IP address to point your domain to, Namecheap won’t accept your full domain name in its hosting panel and requires you to use `@` to refer to the domain you are configuring. Subdomains similarly cannot be FQDNs and need to be just the name of the subdomain you are configuring (`www` instead of `www.mysite.com`, for example).
Note that Firebase will occasionally ask you to re-verify your domain. See the conditions for this at the link or in the screenshot below.
The same rule applies for re-verification: use the `@` host to add the custom TXT record needed to verify your domain.
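Concretely, the records in Namecheap’s panel end up looking roughly like this (the actual values come from the Firebase console; these are placeholders):

```
Type    Host    Value
A       @       <IP address from the Firebase console>
A       www     <IP address from the Firebase console>
TXT     @       <verification record from the Firebase console>
```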
Considering how easy this was, this might be the way I host all my sites in the future 🙂
Most of the examples I have seen for PaletteGenerator use an in-app image that the system can load immediately, but remote images are more complicated since the library has to wait for enough of the image to load before it can read the color data. This gets further complicated if you need to run an animation while the image loads.
After trying a number of iterations, the best approach seems to be using Flutter’s precacheImage method before kicking off the animation.
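Here is a rough sketch of that approach (the widget, image URL and animation here are made up for illustration; the gist is that `precacheImage` resolves once the image bytes are in, so the palette read and the animation don’t race the download):

```dart
import 'package:flutter/material.dart';
import 'package:palette_generator/palette_generator.dart';

class HeroImage extends StatefulWidget {
  const HeroImage({super.key});

  @override
  State<HeroImage> createState() => _HeroImageState();
}

class _HeroImageState extends State<HeroImage>
    with SingleTickerProviderStateMixin {
  // Hypothetical remote image.
  final ImageProvider _image =
      const NetworkImage('https://example.com/hero.jpg');
  late final AnimationController _fade = AnimationController(
      vsync: this, duration: const Duration(milliseconds: 600));
  Color? _background;
  bool _started = false;

  @override
  void didChangeDependencies() {
    super.didChangeDependencies();
    if (_started) return; // didChangeDependencies can fire more than once
    _started = true;
    // Wait for the image to be cached before reading colors or animating.
    precacheImage(_image, context).then((_) async {
      final palette = await PaletteGenerator.fromImageProvider(_image);
      if (!mounted) return;
      setState(() => _background = palette.dominantColor?.color);
      _fade.forward(); // kick off the animation only once the image is ready
    });
  }

  @override
  void dispose() {
    _fade.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return Container(
      color: _background,
      child: FadeTransition(
        opacity: _fade,
        child: Image(image: _image),
      ),
    );
  }
}
```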
🙃 Can’t believe 2019 is over. Fun was had, life was lived. So let’s talk about it
Most of my work in 2019 was split between conversational technologies (bots and such), Flutter, some Machine Learning and finally some Blockchain stuff. So here is a quick recap of the year:
I spent a lot of time this year evaluating various technologies in the context of virtual conversational assistants. I remain a passionate believer in the chatbot space, even with the fervor around it dying out after the whole “Bots are the new apps” idea didn’t really happen.
As with a lot of technology domains right now (VR, blockchains, etc.), the dying out of the initial mania is allowing some really interesting work to proceed and evolve the space without a harsh spotlight and investors expecting 10x returns in 2 years.
The problems in that space (IMHO) right now really come down to the facts that:
Writing bot dialogue is hard and manually authored conversation trees can’t scale
Tools for authoring and previewing bot dialogues are poor
AI-based systems that can hold a true dynamic conversation aren’t really there yet and
There is very little exploration of the user-experience beyond text and animated gifs.
I still really believe that we will need virtual agents as proxies for ourselves and the services we interact with as the digital world becomes more complex. It’ll be interesting to see if this space evolves or becomes the next IVR system that no one loves.
Speaking of user-experience, I played a lot with Flutter this year and have already written about it in a previous post. There are 3 reasons I like Flutter:
It’s a cross-platform tool that gives me a lot of control over the graphics (unlike, say, React Native)
It’s pushing a culture of advanced UIs that are simple to build, which I felt kinda suffered when Flash died
The fact that Google commissioned GSkinner.com to create some amazing UIs and gave away the code for others to use in their apps just underscores the kind of experiences they wish people would create with it. Here’s hoping Flutter sees more adoption in 2020.
I finally got to work on some Machine Learning based projects this year which was interesting. While I wouldn’t call myself competent in that domain yet, I feel I could get there in 2020 (hopefully). I am also very interested in the emergence of higher-level tools that make working with ML even easier, like Uber’s Ludwig and tools like RunwayML.
One particular area of ML that I got into this year was Affective Computing. I am fascinated by the idea of empathetic systems (whether they use AI or not) and exploring the area of Affective Computing gave me a lot to think about. Some of that I even shared at a couple of conferences this year, including the PHLAI conference.
I wish I had done more with Blockchains this year, but my efforts in that space this year were mostly limited to managing the Comcast Blockchain Guild, attending the local Ethereum meetup and the Philly BlockchainTech meetup and trying to keep up with the torrent of news coming out of the dev community. My personal goal is to do a little more hands-on coding in that space again in 2020 🤞
I attended a few conferences this year, each very different from the others.
Google IO was really inspirational, with a lot of ideas to come back with. It is amazing to see how much Google has embraced AI and the kinds of experiences AI has enabled. I actually kept the Android sessions I attended this year to a minimum as I was getting a lot more interested in other spaces like AI, Flutter and Firebase. I was also very pleasantly surprised by the Chrome experiences on display at IO. It’s amazing to see how far the web platform has come.
My favorite tech conference of the year had to be the Eyeo Festival. The conference explores the space at the intersection of art and technology and had some truly inspiring sessions with amazing speakers. You can check out my Twitter thread on some of the sessions I attended, but I’d strongly encourage you to check out as many sessions as you can from Eyeo 2019 on Vimeo.
I spoke at PHLAI on Affective AI. Had a lot of imposter-syndrome going on given that I was speaking at an AI/ML conference with some very high profile speakers
I was at a panel on Smart Contracts at Coinvention 2019 moderated by the amazing Thomas Jay Rush (of Quickblocks.io)
Attended the Blockchain and Other Networks conference by TTI Vanguard which was really interesting, especially with a format where every attendee could interrupt the speaker at any time if they had a question. Someone later recognized me there as “oh yeah, you are the one with all the questions” 😜
Firebase is pretty magic and for the most part delivers on the idea of an instant-on backend for your mobile or web app. The problem with magic, though, is that it’s hard to plan for if or when something does break. Take database backups, for example. While traditional database backups are a known science, backing up Firebase’s Firestore database is still a poorly documented, infrequently attempted effort.
I tried implementing that today and the official documentation only stressed me out. Thankfully, I found this article which got me most of the way there, though of course a few things had changed (API endpoints are now versioned as v1, and the GoogleAuth library has a minor change in how it is initialized). Anyway, I am glad it’s finally done, but I wanted to share it here in case anyone else is looking for it.
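For anyone piecing it together, the core of it is just an authenticated POST to the v1 `exportDocuments` endpoint. This is a simplified sketch, not the full scheduled cloud function from that article: the project ID and bucket are placeholders, and `accessToken` is assumed to come from an auth library such as GoogleAuth.

```javascript
// Build the v1 exportDocuments endpoint for a project's default database.
function buildExportUrl(projectId) {
  return `https://firestore.googleapis.com/v1/projects/${projectId}` +
      `/databases/(default):exportDocuments`;
}

// Sketch: kick off an export into a Cloud Storage bucket.
// `accessToken` is assumed to be an OAuth2 token with Datastore scope.
async function exportFirestore(projectId, bucket, accessToken) {
  const res = await fetch(buildExportUrl(projectId), {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${accessToken}`,
      'Content-Type': 'application/json',
    },
    // The export lands under this gs:// prefix.
    body: JSON.stringify({ outputUriPrefix: `gs://${bucket}` }),
  });
  return res.json(); // a long-running Operation describing the export
}
```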
While in Firebase, I also got a chance to play around with its testing setup (the one cloud function in my project didn’t have a test, so I implemented one). The local testing setup seems interesting, but unfortunately I couldn’t get too far since FirebaseAuth isn’t supported yet, so for now my tests connect to a real Firebase account and run there. Hopefully FirebaseAuth triggers will be supported soon.
Finally I did try using Firebase Extensions for the first time and tied my Firebase Auth system to MailChimp. That worked flawlessly. I am really excited about the future of Firebase Extensions. Hopefully we’ll start seeing more functionality available as just a one-click addition to your project.
When I first heard of Flutter last year, I couldn’t help but draw parallels to Java Swing, the UI technology I started with in grad school (and thankfully dropped a year or so later). If you don’t know much about Java’s UI technologies, suffice to say that for all of Java’s strengths, no version of their UI framework was ever one of them.
It started okay-ish enough with Java’s AWT toolkit, which let Java call native code to create system windows, buttons, etc., but devs soon realized that building cross-platform applications (always Java’s pitch) was really hard when you could only target the least-common-denominator widgets available across all platforms. “No more,” said the Java community, and proceeded to build Swing, a cross-platform UI framework that emulated the system controls by drawing them itself on a canvas.
Sound familiar? That is what Flutter promises with its core graphics engine that emulates the native Android and iOS widgets.
The problem is that Swing turned out to be crap. The widgets never felt native and performed poorly. You could always tell when you were using a Swing app. And it was always interesting when some app wasn’t coded right and you’d end up with apps emulating the Windows look-and-feel on a Mac (who tested on a Mac back in those days?).
But then a couple of things happened. First, I saw some pretty compelling Flutter-based apps, and the interesting thing was that the best apps try to create their own design language anyway, so deviating from the system look-and-feel felt okay. Second, I tried Flutter for a labweek project and was won over by the one-click deploy to multiple platforms and the hot reload. (It might also have been fortuitous timing, as I was losing my mind over React Native’s minimal support for custom views and animations, something Flutter promises a lot more control over.)
But the core reason I am excited about Flutter is the culture. The reason Swing was a dud (IMHO) was that it was built by people who didn’t care to push UI experiences. The native mobile toolkits are better but still make it hard to build complex user interfaces (SwiftUI and Jetpack Compose are trying to change that). Example “Hello World” apps you see using native toolkits are pretty generic form-based apps.
But look at the kinds of apps Flutter shows off:
While this may not be everyone’s cup of tea, this “think-outside-boxy-layouts” approach has my vote.
4 months in, I am still new-ish to Flutter but I guess I am on board. Stay tuned for more random Flutter stuff on this blog 🙂
For the last few years I have been thinking quite a bit about how we enable more people to learn programming. As an industry, we need more programmers everywhere, and there seems to be a huge number of people who want to come in. Unfortunately, we can’t seem to connect the two sides of this equation effectively.
Specifically, I have been thinking about learning curves. Until recently I believed learning curves followed a close-to-linear relationship with time: you learn a little bit at the beginning, work on simple ideas, and learn more and more as time goes by.
This seems to be codified in most programming books too, which introduce simple ideas at the beginning and then move towards more complex ones.
However, lately I feel a more honest representation of this learning curve we expect a newcomer to master would look something like this.
The initial hump in that graph represents a mountain of complexity that junior programmers are handed before they can do anything with code. A lot of the time this hump is meta-work: things that are not core to the technology itself, like build systems, testing frameworks, coding standards, etc.
The same goes for mobile app development. For example, if you are looking to make your first Android app, a brand-new project created with the Android Studio wizard drops you into a mess of Gradle, Java, Kotlin and XML files.
Tools like Xcode and Android Studio are also extremely complicated for any beginner, with a ton of panels and tools to click on without knowing what they do. Ironically, most of the teams building these tools have user-experience professionals on them, and yet the ideas of progressive disclosure and first-run experience, which as an industry we keep touting for our end-user apps, are never considered.
Technology Complexity Cycle
Reflecting on my own learning-programming experience and talking about this with a few other people, I realize that another thing that got me into programming was also working on a technology (Flash) that wasn’t as mature.
When I started playing with Flash, it was back in the Flash 4 days, with a very simple programming model where most of the code was written in small scripts attached to the timeline that just controlled the position of the animation playhead. My learning-to-code experience happened almost in sync with the addition of complexity to Flash. Towards the end of my time with Flash, it had gotten complex enough, with ActionScript 3 and the need to become a “real” programming platform, that it started to lose people.
I feel this happens a lot. Early versions of a programming platform are simple and functional, and then, if it catches the attention of the “serious” programmers, way too much complexity gets added. This complexity makes the technology a daunting beast for new entrants.
The point is …
I had a couple of thoughts for new programmers that became the primary motivation for this post:
Survive the initial hump: Getting started with programming is a lot harder at the beginning, so stick with it. It does get easier once you cross the initial hump of tools and meta-work that goes into starting a project and is very rarely revisited once the project is in active development.
Play with emerging technologies: Emerging technologies often don’t have much of an initial hump, as the tooling and other meta-work haven’t been invented yet. Technologies like WebVR, blockchains, Flutter, etc. are great candidates to play with now so you can grow your skills as the technology matures.
And for those of us who have been in this industry for a while and may have the power to influence the tooling and/or methodologies of how code is written, let’s endeavor to make these more welcoming to folks at different levels of experience with tech.
I have been looking into this space for the last 4 months or so and it is a really fascinating space. I am a big believer in the need for assistive systems to mediate an increasingly complex world for us and that these assistants would need to be not only competent but also empathetic. However as with most new technologies, Affective Computing has a lot of potential for abuse as well if these technologies are used to emotionally manipulate or deceive people. It is definitely a path we must tread very carefully.
In general I really enjoyed the PHLAI conference and met some really interesting people and saw some very impressive technical talks and demos by IBM, Tableau, H2O, SAS and Ascend. Thank you to the organizers for putting on a great event. Definitely looking forward to the next one!
It’s always nice to be recognized. And since part of the purpose of this blog is to serve as a historical record of my professional life, I figured I should post about it here before I lose the link to the sands of time 😉
And we have a good one lined up with some of the biggest tech leaders in Philly on a panel on managing your career as a technologist. If you are a developer or are looking to become one, you should definitely sign up.
For me it’s certainly a time for some celebration and reflection. Corey and I started Philly GDG, or rather its previous incarnation, AndroidPhilly, in 2011 when both of us had just about started working on Android and realized there wasn’t a local community where we could learn from each other. And considering how minimal technical documentation and user experience guidelines were back then, a local community was sorely needed. The group transitioned to an official GDG at some point which meant we got a lot more support from Google in terms of speakers and schwag.
Thinking back, there are a lot of things that worked well. The consistency of the day (last Wednesday of every month) and location (Comcast Center) definitely was a good idea and built up a monthly habit for the regular members. Comcast has been great about sponsoring this event every month since its inception, and my managers, former and current, have been very supportive of letting me run this. Other companies in Philly have been fantastic supporters as well, including Promptworks, Chariot, Candidate and others who have hosted or supported us with food and beverages over time.
We are also a better-balanced community as far as gender goes, with more women participating than in a lot of other communities. A lot of credit there goes to Corey for leading the outreach in the early days and always making sure we had women among the leads. It’s something the current leads, Yash, John and Ruthie, continue to champion.
There have also always been a lot of challenges, some similar to those faced by other groups, others unique to our own. Sourcing speakers every month is hard, especially when your community is much smaller than those in cities like SF and NY. Creating a channel for the community to keep the conversation going has also been challenging, with Slack becoming the de facto community platform even though it doesn’t really work if you aren’t paying for it (I am starting to look at other platforms like Discord, but a lot of people may not be willing to install another app). Trying to balance the level of talks has also been a concern: we want intro-level talks to bring new people in but also more advanced sessions for folks who have been coming for a while. If you have ideas on any of these, I am all ears.
I have made a lot of friends thanks to our group, from past (Corey, Chuck, Dallas) and present (Yash, John, Ruthie and Kenny) fellow organizers who helped run this group, to regular members who have been attending our monthly meetup for years.
I am looking forward to seeing how the group evolves going forward. In the meanwhile, if you are in the neighborhood, join us for our 🎉 100th event. It promises to be a great one.
This really old post still does a great job of bringing us up to speed on the Unicode world we live in today. And then came emojis.
There are numerous posts about the pain of dealing with emojis, because Unicode does screwy things like combining neighboring characters to form a single emoji. This means that the length of a string, whether measured in code units or code points, can be different from what you would count on the screen.
This gives you wacky results like “💩”.length == 2 and generally makes working with strings a pain, even to the extent of crashing your iPhone. On the flip side, some things are kinda amusing, like being able to delete family members from the 4-member family emoji with every backspace, since it is actually 7 characters: 4 ‘normal’ people characters and 3 invisible ‘joining’ characters in between.
Except that the characters in the list also included emojis, and Dart’s Uri class doesn’t work with anything beyond ASCII and crashes when encountering strings with emojis that are just ‘escaped’. This, it turns out, is as per spec, and all those fancy emoji domains that I thought used Unicode in the URI actually use a different idea built around Internationalized Resource Identifiers and Punycode. Thankfully, passing in a URI-encoded string with emojis seems to work fine, and the emojis come out 👍 on the other side of the decode process.
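For example, this round-trip works in Dart (a quick sketch; the string is arbitrary):

```dart
void main() {
  const original = 'mood: 💩';
  final encoded = Uri.encodeComponent(original);
  print(encoded); // the emoji becomes percent-encoded UTF-8 bytes
  final decoded = Uri.decodeComponent(encoded);
  print(decoded == original); // true
}
```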
While this seemed to work at that point, passing the decoded string to my YAML loader crashed the app again (is YAML supposed to be restricted to ASCII/UTF-8?). But that is a problem for a different day.
For now, I have decided to just convert emojis to shortcodes for transit and remap them to emojis on the other side. It’s not pretty, but it works.
Oh, and in the meanwhile, if you want to know how to loop through a String with emojis in Dart, you can do that by iterating over the Runes of the String:
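Something like this works (note that runes are code points, so the invisible joiners in the family emoji would still show up as their own entries):

```dart
void main() {
  const s = 'I 💩 emoji';
  print(s.length);       // 10 — UTF-16 code units ('💩' is a surrogate pair)
  print(s.runes.length); //  9 — Unicode code points
  for (final rune in s.runes) {
    print(String.fromCharCode(rune));
  }
}
```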