Fragments 2019

State of mobile engineering, state of platforms, hardware and user research.

What you can expect from the Bangalore edition of Fragments on 30 March:

Fragments was launched in response to the fragmented nature of engineering and software development for mobile. Much changed for mobile engineering in 2017 when React Native entered the landscape and cross-platform mobile development took off in a big way.

At the recently concluded ReactFoo, we talked not only about the ecosystem emerging around React, but also about how organizations are structuring teams for Android, iOS and cross-platform mobile development.

Coming back to Fragments, the Android versus iOS debate seems settled, with Android winning the turf. Now, with Flutter taking off as a platform for mobile engineering, the battleground has shifted to React Native versus Flutter. Who wins is not only a matter of adoption and user-base metrics, but also of which platform has the stronger community around it.
On 30 March, speakers Priyanka Sabhagani and Ajin Asokan will share BookMyShow’s and Zerodha’s experiences (respectively) with React Native and Flutter, helping participants evaluate each platform’s strengths and weaknesses. We look forward to deeper discussions around Flutter, which has piqued a great deal of interest among developers (who have to write less code), but also greater skepticism among senior developers about its capabilities with respect to data storage and related issues.

Apart from Flutter, the Bangalore edition of Fragments will feature talks on Kotlin and native app development. An interesting question to discuss here is when to go native with your app and when not to. Which factors inform such a decision?
The other discussion, which Raghunath Jawahar, Varsha Saha and Abhinav Rastogi will take up, is what native app developers can learn from mature web front-end architectures. The discussion will steer around the following topics:

  1. The dynamic nature of JavaScript versus static languages like Java/Kotlin/Swift, which are used to develop native mobile apps.
  2. Expectations and user experience on web versus mobile.
  3. The asynchronous nature of mobile platforms and their lifecycles – the unique challenges this factor presents.

Finally, Fragments Bangalore will showcase talks on:

  1. Image uploads, and Constraint and Motion layouts.
  2. Optimizing the size of your mobile app.
  3. Voice, and its role in building augmented experiences for mobile apps.
  4. How the concept of Seams can be applied for building flexible and testable apps.
  5. Building predictable, high-performance workflows for mobile app engineering.

Who should participate in the Bangalore edition of Fragments Conference?

  • Mobile engineers working on Android and cross-platform apps.
  • Senior developers and tech leads.
  • Backend developers and fullstack engineers.
  • Product managers.
  • Product engineers.

Event details:

Date: 30 March 2019
Time: 9:00 AM to 5:30 PM
Venue: TERI auditorium, Domlur, Bangalore

Contact:

For more details, call us on 7676332020 or write to us at info@hasgeek.com.

Future editions of Fragments:

Fragments will be held in Kochi and Hyderabad in 2019. Dates will be announced in April. If you wish to speak at any of the future editions of FragmentsConf, submit a proposal here: https://hasgeek.com/fragments/fragments-round-the-year-proposals-2019/

Hosted by

How do you make a great mobile experience? Explore with Fragments. Follow Fragments on Twitter.

Harshit Dwivedi

@the-dagger

Firebase ML Kit: Machine Learning made easy

Submitted May 24, 2018

At I/O 2018, Google released Firebase ML Kit, which creates various exciting opportunities for Android developers aiming to build smart apps without having to worry about the nitty-gritty of Machine Learning.

The Firebase ML Kit APIs offer features like face detection, text recognition, barcode scanning, etc.
Your apps can also label a provided image for special characteristics and identify popular landmarks in a picture.
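As a taste of what these APIs look like in practice, here is a minimal sketch of on-device text recognition, assuming the 2018-era `firebase-ml-vision` dependency and a configured Firebase project; the `recognizeText` helper and its `bitmap` parameter are illustrative names, not code from the talk’s sample app.

```kotlin
import android.graphics.Bitmap
import android.util.Log
import com.google.firebase.ml.vision.FirebaseVision
import com.google.firebase.ml.vision.common.FirebaseVisionImage

// Hypothetical helper: runs on-device text recognition on a Bitmap
// and logs the recognized text. Assumes the firebase-ml-vision
// dependency is on the classpath and Firebase is initialized.
fun recognizeText(bitmap: Bitmap) {
    val image = FirebaseVisionImage.fromBitmap(bitmap)
    val detector = FirebaseVision.getInstance().onDeviceTextRecognizer
    detector.processImage(image)
        .addOnSuccessListener { result ->
            // result.textBlocks also exposes per-block text and bounding boxes
            Log.d("MLKit", result.text)
        }
        .addOnFailureListener { e ->
            Log.e("MLKit", "Text recognition failed", e)
        }
}
```

The other vision APIs (face detection, barcode scanning, image labeling, landmark recognition) follow the same pattern: build a `FirebaseVisionImage`, fetch the matching detector, and process asynchronously.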

In this talk, I will outline the usage of all five APIs available in Firebase ML Kit, demonstrating them through a sample app that utilizes these APIs.

I will walk you through the working of each API, and you will leave the talk with sufficient knowledge of the APIs to go ahead and implement them in your own apps.

Outline

This tech talk will outline the usage of all five APIs available in Firebase ML Kit.
I’ll be using a sample app that I’ve created and will walk the participants through the working of each API.

I’ll also talk about how to upload a custom model to Firebase and use it instead of the preloaded models.
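For the custom-model path, a rough sketch of registering a cloud-hosted TensorFlow Lite model with the 2018-era ML Kit custom model API might look as follows. The model name "my_model" is a placeholder for whatever name the model was uploaded under in the Firebase console, and the builder methods reflect the API as introduced at I/O 2018, so treat this as an assumption-laden illustration rather than the talk’s actual code.

```kotlin
import com.google.firebase.ml.custom.FirebaseModelInterpreter
import com.google.firebase.ml.custom.FirebaseModelManager
import com.google.firebase.ml.custom.FirebaseModelOptions
import com.google.firebase.ml.custom.model.FirebaseCloudModelSource
import com.google.firebase.ml.custom.model.FirebaseModelDownloadConditions

// Illustrative setup: "my_model" must match the name given to the
// model uploaded in the Firebase console.
fun loadCloudModel(): FirebaseModelInterpreter? {
    // Only download (and update) the model over Wi-Fi.
    val conditions = FirebaseModelDownloadConditions.Builder()
        .requireWifi()
        .build()

    val cloudSource = FirebaseCloudModelSource.Builder("my_model")
        .enableModelUpdates(true)
        .setInitialDownloadConditions(conditions)
        .setUpdatesDownloadConditions(conditions)
        .build()
    FirebaseModelManager.getInstance().registerCloudModelSource(cloudSource)

    // The interpreter runs inference against the downloaded model.
    val options = FirebaseModelOptions.Builder()
        .setCloudModelName("my_model")
        .build()
    return FirebaseModelInterpreter.getInstance(options)
}
```

The same interpreter can also be pointed at a local `.tflite` file bundled in the app’s assets, which is useful as an offline fallback while the cloud copy downloads.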

After the talk, attendees will have a good overview of these newly introduced APIs and enough knowledge to go ahead and implement them in their own apps.

  1. Who should attend this tutorial?
    Android developers who are excited by the concept of Machine Learning but don’t have a good idea of how to start with it.
    The talk will start by covering some basic APIs and then move on to how you can train a custom model of your own and use it to perform inference in your own app.

  2. Why attend?
    You’ll get a good overview of how you can start experimenting with Machine Learning by combining it with Android development.
    This session will also serve as a motivator for you to get started with Machine Learning.

  3. Duration of the tutorial.
    2.5 hours to 3 hours

  4. Prior requirements in terms of knowledge; software installation and equipment that a participant needs to carry for this tutorial.
    Basics of Android development, basic Kotlin, Android Studio, Python 3, TensorFlow (optional).

Some blogs I’ve written on the same topic:
https://medium.com/coding-blocks/google-lens-firebase-54d34d7e1505

https://medium.com/coding-blocks/creating-a-credit-card-scanner-using-firebase-mlkit-5345140f6a5c

https://medium.com/coding-blocks/creating-a-qr-code-reader-using-firebase-mlkit-60bb882f95f9

https://medium.com/coding-blocks/identifying-places-in-a-provided-image-using-firebase-mlkit-fe3c918756da

https://heartbeat.fritz.ai/building-pok%C3%A9dex-in-android-using-tensorflow-lite-and-firebase-cc780848395

https://heartbeat.fritz.ai/embracing-machine-learning-as-a-mobile-developer-4ebcda58d4ac

GitHub repo for the code covered in the blog posts:
https://github.com/the-dagger/MLKitAndroid

Requirements

Basic Android development / knowledge of basic Kotlin.

Speaker bio

An Android developer and avid tech blogger, Harshit is passionate about anything and everything related to Android.
He is one of the first Google-certified Android developers in India, and as an Open Source enthusiast, he has also been part of programs like Google Summer of Code and Google Code-in as a mentor.

Recently, he has been drawn to his new-found love of combining his knowledge of mobile development with Machine Learning to create smart mobile apps.
Harshit works with Udacity and Coding Blocks, a startup in New Delhi focused on creating more employable talent.

Links

Slides

https://docs.google.com/presentation/d/1ibjMFCZkqbdN6Arxbt8HxKLOOAK2RahIDCyOMCuBJQI/edit?usp=sharing

