WTH is Injection Token in Angular


Angular is a very advanced and well-thought-out framework where Components, Directives, Services, and Routes are just the tip of the iceberg. So my intention with this WTH series of posts is to understand something in Angular (and its brethren) that I do not know yet, and to teach others as well.

Let us start with a simple example, an Angular Injectable Service that we all know and use extensively in any Angular project. This particular example is also very common in Angular projects where we often maintain environment-specific configurations.

// environment.service.ts
import { Injectable } from '@angular/core';

@Injectable({providedIn: 'root'})
export class Environment1 {
    production: boolean = false;
}

Here, we annotated a standard ES6 class with the @Injectable decorator and asked Angular to provide it in the application root, making it a singleton service, i.e. a single instance is shared across the application. Then we can use the Typescript constructor shorthand syntax to inject the service into a component's constructor as follows.

// app.component.ts
import { Component } from '@angular/core';

import { Environment1 } from './environment.service'; 

@Component({
    selector: 'my-app',
    templateUrl: './app.component.html'
})
export class AppComponent  {
    constructor(private environment1: Environment1) {}
}

But often such environment-specific configurations come in the form of POJOs (plain old JavaScript objects), not ES6 classes.

// environment.service.ts
export interface Environment {
    production: boolean;
}
export const Environment2: Environment = {
    production: false
};

So the Typescript constructor shorthand syntax will not help here. However, we could naively store the POJO in a class property and use it in the template.

// app.component.ts
import { Component } from '@angular/core';

import { Environment, Environment2 } from './environment.service'; 

@Component({
    selector: 'my-app',
    templateUrl: './app.component.html'
})
export class AppComponent  {
    environment2: Environment = Environment2;
}

This will work, no doubt! But it defeats the whole purpose of Dependency Injection (DI) in Angular, which lets us mock dependencies seamlessly while testing.
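To see why, here is a minimal, framework-free sketch of token-based lookup (TinyInjector, ENV_TOKEN, and friends are invented for illustration; Angular's real injector is far more capable). The point is that whoever asks for the token does not care which value was provided, so a test can swap in a mock:

```typescript
// A hypothetical, stripped-down injector: a map from tokens to values.
interface Environment {
  production: boolean;
}

class TinyInjector {
  private providers = new Map<unknown, unknown>();

  provide<T>(token: unknown, value: T): void {
    this.providers.set(token, value);
  }

  get<T>(token: unknown): T {
    if (!this.providers.has(token)) {
      throw new Error('No provider for token');
    }
    return this.providers.get(token) as T;
  }
}

const ENV_TOKEN = 'ENVIRONMENT'; // a token standing in for the real config

const appInjector = new TinyInjector();
appInjector.provide<Environment>(ENV_TOKEN, { production: true });

// In a test, the same token can be re-provided with a mock:
const testInjector = new TinyInjector();
testInjector.provide<Environment>(ENV_TOKEN, { production: false });

console.log(appInjector.get<Environment>(ENV_TOKEN).production);  // true
console.log(testInjector.get<Environment>(ENV_TOKEN).production); // false
```

Conceptually, this is what happens when you re-provide a dependency in an Angular test module.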

InjectionToken

That’s why Angular provides a mechanism to create an injection token for POJOs to make them injectable.

Creating InjectionToken

Creating an Injection Token is pretty straightforward. First, pass a description for your injection token, then set the scope with providedIn (just like the Injectable service we saw earlier), followed by a factory function that will be evaluated when the generated token is injected into a component.

Here, we are creating an injection token ENVIRONMENT for the Environment2 POJO.

// injection.tokens.ts
import { InjectionToken } from '@angular/core';

import { Environment, Environment2 } from './environment.service'; 

export const ENVIRONMENT = new InjectionToken<Environment>(
  'environment',
  {
    providedIn: 'root',
    factory: () => Environment2
  }
);

Feel free to remove providedIn in case you do not want a singleton instance of the token.
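As a side note, the factory is lazy: it does not run when the token is created, only when the token is first injected, and for providedIn: 'root' the result is then shared. A rough, framework-free sketch of that behaviour (LazyToken is a made-up class, not Angular's implementation):

```typescript
// Hypothetical sketch of a lazily-evaluated, memoized token factory.
class LazyToken<T> {
  private value?: T;
  private created = false;

  constructor(private description: string, private factory: () => T) {}

  get(): T {
    if (!this.created) {          // factory runs only on first injection
      this.value = this.factory();
      this.created = true;
    }
    return this.value as T;
  }
}

let factoryCalls = 0;
const ENVIRONMENT = new LazyToken('environment', () => {
  factoryCalls++;
  return { production: false };
});

ENVIRONMENT.get();
ENVIRONMENT.get();
console.log(factoryCalls); // 1 -- the same instance is shared
```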

Injecting InjectionToken

Now that the Injection Token is available, all we need to do is inject it into our component. For that, we can use the @Inject() decorator, which simply resolves the token from the currently active injectors.

// app.component.ts
import { Component, Inject } from '@angular/core';

import { Environment } from './environment.service'; 
import { ENVIRONMENT } from './injection.tokens';

@Component({
  selector: 'my-app',
  templateUrl: './app.component.html'
})
export class AppComponent  {
  constructor(@Inject(ENVIRONMENT) private environment2: Environment) {}
}

Additionally, you can provide the Injection Token in @NgModule and drop the providedIn scope and the factory function when creating the InjectionToken, if that suits you.

import { NgModule } from '@angular/core';
import { BrowserModule } from '@angular/platform-browser';

import { AppComponent } from './app.component';
import { Environment2 } from './environment.service'; 
import { ENVIRONMENT } from './injection.tokens';

@NgModule({
    imports:      [ BrowserModule ],
    declarations: [ AppComponent ],
    bootstrap:    [ AppComponent ],
    providers: [{
        provide: ENVIRONMENT,
        useValue: Environment2
    }]
})
export class AppModule { }

Demo

https://stackblitz.com/edit/angular-2fqggh

That’s it for all. Keep Rocking \m/ \0/

If you found this article useful in any way, feel free to donate and receive my dilettante painting as a token of appreciation for your donation.

Enable tree-shaking in Rails/Webpacker: A Sequel


A month ago, I wrote a blog post explaining a hacky way to enable tree-shaking in a Rails/Webpacker project at Simpl. I would definitely recommend skimming through that post if you have not already.

In this post, we will jump straight into a more robust and stable solution. But before that, let me resurrect the old memories that haunted me for months: a broken manifest.json was being generated at a random point during webpack compilation. This time, after upgrading @rails/webpacker and the related webpack plugins, the problem escalated beyond repair: an incomplete but valid manifest.json was randomly generated with fewer pack entries than expected. So even the generated manifest.json could hardly be rescued by the hacky NodeJS fix_manifest.js script I had written last time to repair the broken JSON.

After a bit of googling my way out, I learned that webpack, with a multi-compiler configuration, compiles each webpack configuration asynchronously and in no particular order. That is why I was getting an invalid manifest.json earlier.

Imagine two webpack compilations running simultaneously and writing to the same manifest.json at the same time:

{
  "b.js": "/packs/b-b8a5b1d3c0c842052d48.js",
  "b.js.map": "/packs/b-b8a5b1d3c0c842052d48.js.map"
}  "a.js": "/packs/a-a3ea1bc1eb2b3544520a.js",
  "a.js.map": "/packs/a-a3ea1bc1eb2b3544520a.js.map"
}

Using different manifest file for each pack

Yes, this is the robust and stable solution I came up with. First, you have to override the Manifest fileName in every webpack configuration in order to generate a separate manifest file for each pack, such as manifest-0.json, manifest-1.json, and so on. Then use the same NodeJS script, fix_manifest.js, with a slight modification to merge all the generated files into a final manifest.json which will be accurate (having all the desired entries) and valid (JSON).

For that, we have to modify the existing generateMultiWebpackConfig method (in ./config/webpack/environment.js) to remove the existing clutter of disabling/enabling the writeToEmit flag in Manifest, which we no longer need. Instead, we will create a deep copy of the original webpack configuration and override the Manifest plugin opts for each entry. The deep copy is mandatory so that a unique Manifest fileName persists for each pack file.
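A tiny sketch of why the deep copy matters (using a JSON round-trip here instead of lodash.clonedeep, which is fine for this plain-data example but would not survive configs containing functions): with a shallow copy, the nested plugin opts objects stay shared, so renaming the manifest for one entry clobbers them all.

```typescript
// Why a shallow copy would break per-entry manifest names.
interface ManifestOpts { fileName: string; }
interface Config { plugins: { opts: ManifestOpts }[]; }

const base: Config = { plugins: [{ opts: { fileName: 'manifest.json' } }] };

// Shallow copy: `plugins` (and everything inside it) is still shared.
const shallow = Object.assign({}, base);
shallow.plugins[0].opts.fileName = 'manifest-0.json';
console.log(base.plugins[0].opts.fileName); // 'manifest-0.json' -- clobbered!

// Deep copy (JSON round-trip standing in for cloneDeep): fully independent.
const base2: Config = { plugins: [{ opts: { fileName: 'manifest.json' } }] };
const deep: Config = JSON.parse(JSON.stringify(base2));
deep.plugins[0].opts.fileName = 'manifest-1.json';
console.log(base2.plugins[0].opts.fileName); // 'manifest.json' -- untouched
```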

const { environment } = require('@rails/webpacker')
const cloneDeep = require('lodash.clonedeep')

environment.generateMultiWebpackConfig = function(env) {
  let webpackConfig = env.toWebpackConfig()
  // extract entries to map later in order to generate separate 
  // webpack configuration for each entry.
  // P.S. extremely important step for tree-shaking
  let entries = Object.keys(webpackConfig.entry)

  // Finally, map over extracted entries to generate a deep copy of
  // Webpack configuration for each entry to override Manifest fileName
  return entries.map((entryName, i) => {
    let deepClonedConfig = cloneDeep(webpackConfig)
    deepClonedConfig.plugins.forEach((plugin, j) => {
      // A check for Manifest Plugin
      if (plugin.opts && plugin.opts.fileName) {
        deepClonedConfig.plugins[j].opts.fileName = `manifest-${i}.json`
      }
    })
    return Object.assign(
      {},
      deepClonedConfig,
      { entry: { [entryName] : webpackConfig.entry[entryName] } }
    )
  })
}

Finally, we will update the ./config/webpack/fix_manifest.js NodeJS script to concatenate all the generated Manifest files into a single manifest.json file.

const fs = require('fs')

let manifestJSON = {}
fs.readdirSync('./public/packs/')
  .filter((fileName) => fileName.indexOf('manifest-') === 0)
  .forEach(fileName => {
    manifestJSON = Object.assign(
      manifestJSON,
      JSON.parse(fs.readFileSync(`./public/packs/${fileName}`, 'utf8'))
    )
})

fs.writeFileSync('./public/packs/manifest.json', JSON.stringify(manifestJSON))

Wrap up

Please note that compiling a huge number of JS/TS entries takes a lot of time and CPU, hence it is recommended to use this approach only in a production environment. Additionally, set max_old_space_size as per your need to handle out-of-memory issues during the production compilation: 8000 MB, i.e. roughly 8 GB, is used here.

$ node --max_old_space_size=8000 node_modules/.bin/webpack --config config/webpack/production.js
$ node config/webpack/fix_manifest.js

Always run those commands one after the other to generate a fit and fine manifest.json 😙


RTFC


I really could not think of a better title for this post because it is not just about using an @Input property setter instead of the ngAfterViewInit life-cycle hook. Hence the title, pretty much inspired by RTFM, where “Manual” is replaced by “Code”.

It’s about how important it is to read the code.

Just read the code..!

Last month I published an Angular blog post on NgConf Medium in which I proposed various ways to use jQuery plugins in Angular. If you have not read it yet, do read it here and comment if you have any thoughts. Unfortunately, I did not get lucky enough to become an ngChampion (kudos to those who did), hence I have decided to publish the sequel here on my personal blog.

So after publishing the first post, I went on reading the source code of the Material Badge component, just casually.

And to my surprise, I noticed 3 profound things:

Structural Directive over Component

It depends on the functionality you want to build into the component. If all you want to do is alter a single DOM element, then always go for a custom structural directive instead of writing a custom component, because a custom component mostly introduces its own APIs unnecessarily.

For example, take a look at the app-toolbar-legends component from the last article. Remember, I am not contradicting myself in this article; however, for this particular jQuery plugin in Angular, we could safely create an Angular Directive rather than an Angular Component with its own API in the form of the class and icon attributes below.

<app-toolbar-legends class="btn-toolbar-success" icon="fa-bitcoin" [toolbarConfig]="{position: 'right'}">
  <div class="toolbar-icons hidden">
    <a href="#"><i class="fa fa-bitcoin"></i></a>
    <a href="#"><i class="fa fa-eur"></i></a>
    <a href="#"><i class="fa fa-cny"></i></a>
  </div>
</app-toolbar-legends>
<app-toolbar-legends class="btn-toolbar-dark" icon="fa-apple" [toolbarConfig]="{position: 'right', style: 'primary', animation: 'flip'}">
  <div class="toolbar-icons hidden">
    <a href="#"><i class="fa fa-android"></i></a>
    <a href="#"><i class="fa fa-apple"></i></a>
    <a href="#"><i class="fa fa-twitter"></i></a>
  </div>
</app-toolbar-legends>

That means we can simplify the usage of the jQuery plugin in Angular by slapping an Angular Directive on the existing markup as follows. There is no need to understand where the class or icon values end up in the component template; it is pretty clear and concise here. Easy: just slap a directive, appToolbarLegends, along with the jQuery plugin configurations.

<div class="btn-toolbar btn-toolbar-success" [appToolbarLegends]="{position: 'right'}" >
  <i class="fa fa-bitcoin"></i>
</div>
<div class="toolbar-icons hidden">
  <a href="#"><i class="fa fa-bitcoin"></i></a>
  <a href="#"><i class="fa fa-eur"></i></a>
  <a href="#"><i class="fa fa-cny"></i></a>
</div>


<div class="btn-toolbar btn-toolbar-dark" [appToolbarLegends]="{position: 'right', style: 'primary', animation: 'flip'}">
  <i class="fa fa-apple"></i>
</div>
<div class="toolbar-icons hidden">
  <a href="#"><i class="fa fa-android"></i></a>
  <a href="#"><i class="fa fa-apple"></i></a>
  <a href="#"><i class="fa fa-twitter"></i></a>
</div>

Generate Unique Id for the DOM

I wanted a unique id attribute for each instance of the toolbar in order to map it to its respective toolbar buttons. I am still laughing at myself for going above and beyond just to generate a unique ID with zero dependencies. Finally, StackOverflow came to the rescue 😅

Math.random().toString(36).substr(2, 9)

But while reading the source code of the Material Badge component, I found an elegant approach that I wish to frame on the wall someday 😂. It generates a unique _contentId for each instance of the directive without much fuss.

import { Directive } from '@angular/core';
let nextId = 0;
@Directive({
  selector: '[appToolbarLegends]'
})
export class LegendsDirective {
  private _contentId: string = `toolbar_${nextId++}`;
}

@Input property setter vs ngAfterViewInit

Before we get into the getter/setter, let us understand when and why to use ngAfterViewInit. It is fairly easy to understand: it is a life-cycle hook that fires once the view of the component or directive it is attached to has been initialized, after all of its bindings are evaluated. That means if you are not querying the DOM, or DOM attributes that have interpolation bindings on them, you can simply use a class setter method as a substitute.

import { Directive, Input } from '@angular/core';
let nextId = 0;
@Directive({
  selector: '[appToolbarLegends]'
})
export class LegendsDirective {
  private _contentId: string = `toolbar_${nextId++}`;
  @Input('appToolbarLegends')
  set config(toolbarConfig: object) {
    console.log(toolbarConfig); // logs {position: "right"} object
  }
}

Class setters are called well before ngAfterViewInit or ngOnInit, which slightly speeds up directive instantiation. Also, unlike ngAfterViewInit or ngOnInit, a class setter is called every time a new value is about to be set, giving us the ability to destroy and recreate the plugin with new configurations.
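A stripped-down sketch of that behaviour, outside Angular (PluginHost is a made-up class; imagine the plugin teardown and setup happening inside the setter):

```typescript
// Sketch: a setter runs on every new value, unlike a one-shot init hook.
class PluginHost {
  initCount = 0;

  set config(_value: object) {
    // imagine destroying and re-creating the jQuery plugin here
    this.initCount++;
  }
}

const host = new PluginHost();
host.config = { position: 'right' };                    // first binding
host.config = { position: 'right', animation: 'flip' }; // re-binding
console.log(host.initCount); // 2 -- the setter ran for each assignment
```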

Demo Day

Thanks for coming this far. The moral of the story is: do read code written by others, no matter which open-source project it is.

Just read the code..!

https://stackblitz.com/edit/angular-zgi4er?embed=1&file=src/app/app.component.ts

TypeScript: Generocks


This blog post about Typescript Generics sat in my drafts for a while (not sure why), and it is finally out. 🤞 But before we get into Generics, we should understand what makes types in Typescript exciting.

Typescript allows us to type check the code so that errors are caught at compile time instead of run time. For example, the following JavaScript code may look correct while writing it but will throw an error in a browser or NodeJS environment.

const life = 42;
life = 24;  // no error until run time, where it throws a TypeError

In this case, Typescript would infer the type of the variable life from its value 42 and flag the faulty reassignment in the editor/terminal. Additionally, you can specify the type explicitly if needed:

const life: number = 42;
life = 24; // Throws an error at compile time

Named Types

So there are a few built-in types in Typescript such as number, string, object, array, boolean, any, and void (apart from undefined, null, never, and the more recent unknown). However, these are not enough, especially when we use them together in a large project; we might need a sort of umbrella type, a custom type that holds them together and is reusable. Such aliases are called named types, and they can be used to create custom types. They are classes, interfaces, enums, and type aliases.
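The examples below use interfaces, but a type alias or an enum serves as a named type just as well; a quick sketch with invented names (Stage, Flag, AppConfig):

```typescript
// Besides interfaces, type aliases and enums are also named types.
enum Stage { Dev, Prod }

type Flag = boolean | 'on' | 'off';   // a type alias over a union

interface AppConfig {
  stage: Stage;
  verbose: Flag;
}

const config: AppConfig = { stage: Stage.Dev, verbose: 'on' };
console.log(config.verbose); // 'on'
```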

For example, we can create a custom type, MyType comprising a few primitive types as follows.

interface MyType {
  foo: string;
  bar: number;
  baz: boolean;
}

But what if we want foo to be either a string, an object, or an array? One way is to copy the existing interface to MyType2 (and so on).

interface MyType2 {
  foo: Array<string>;
  bar: number;
  baz: boolean;
}

Just as we make a function reusable by passing arbitrary values through its parameters, what if we were allowed to do the same for MyType? With that approach, we would not need to duplicate code to handle the same shape of data with different types. But before we dive in, let us first understand the problem with more clarity. To do that, we can write a cache function to cache some random string values.

function cache(key: string, value: string): string {
  (<any>window).cacheList = (<any>window).cacheList || {};
  (<any>window).cacheList[key] = value;
  return value;
}

Because of the strict type checking, we are forcing the parameter value to be of type string. But what if someone wants to cache a numeric value? I will give you a hint: which operator in JavaScript do we use for a fallback value when the expected value of a variable is falsy? Correct, we use ||, a.k.a. the logical OR operator. Now imagine for a second that you are the creator of the Typescript language (sorry, Anders Hejlsberg) and want to resolve this issue for all developers. You might go for a similar solution to have a fallback type and, after countless hours of brainstorming, end up reusing the bitwise OR token, |, for it (FYI, these are called union types in the alternate dimension where Anders Hejlsberg is still the creator of Typescript).

function cache(key: string, value: string | number): string | number {
  (<any>window).cacheList = (<any>window).cacheList || {};
  (<any>window).cacheList[key] = value;
  return value;
}

Is that not amazing? Wait, but what if someone wants to cache boolean values, or arrays/objects of custom types? Since the list is never-ending, our current solution is clearly not scalable. Would it not be great to control these types from outside? I mean, how about we define placeholder types inside the above implementation and provide the real types from the call site instead?

Generic Function

Let us use ValueType (or use any other placeholder or simply T to suit your needs) as a placeholder wherever needed.

function cache<ValueType>(key: string, value: ValueType): ValueType {
  (<any>window).cacheList = (<any>window).cacheList || {};
  (<any>window).cacheList[key] = value;
  return value;
}

We can even pass the custom type parameter MyType to the cache method in order to type check the value for correctness (try changing bar‘s value to be non-numeric and see for yourself).

cache<MyType>("bar", { foo: "foo", bar: 42, baz: true });

This mechanism of parameterizing types is called Generics. This, in fact, is a generic function.
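One nicety worth noting: the type parameter can usually be inferred from the argument, so call sites rarely need to spell it out. A self-contained variant of the cache function (using a local cacheList object instead of window so it runs outside the browser):

```typescript
// Local stand-in for (<any>window).cacheList so the sketch runs anywhere.
const cacheList: { [key: string]: unknown } = {};

function cache<ValueType>(key: string, value: ValueType): ValueType {
  cacheList[key] = value;
  return value;
}

const n = cache('life', 42);            // ValueType inferred as number
const s = cache<string>('name', 'foo'); // or passed explicitly
console.log(n + 1, s.toUpperCase()); // 43 FOO
```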

Generic Classes

Similar to the generic function, we can also create generic classes using the same syntax. Here we have created a wrapper class CacheManager to hold the previously defined cache method and the global (<any>window).cacheList variable as a private property cacheList.

class CacheManager<ValueType> {
  private cacheList: { [key: string]: ValueType } = {};
  cache(key: string, value: ValueType): ValueType {
    this.cacheList[key] = value;
    return value;
  }
}
new CacheManager<MyType>().cache("bar", { foo: "bar", bar: 42, baz: true });

Even though the above code is perfect for encouraging reusability of CacheManager while caching all types of values, someday there will be a need for varying types in MyType's properties as well. That brings back the original problem of MyType vs MyType2 (from the Named Types section above). To prevent us from duplicating the custom type MyType to accommodate varying property types, Typescript allows generic types even on interfaces, which makes them Generic Interfaces. In fact, we are not restricted to one type parameter below; use as many as needed. Additionally, we can use union types as a fallback to the provided parameter types. This permits us to pass an empty {} object while using the generic interface wherever we desire the default types of values.

interface MyType<FooType, BarType, BazType> {
  foo: FooType | string;
  bar: BarType | number;
  baz: BazType | boolean;
}
new CacheManager<MyType<{},{},{}>>().cache("bar", { foo: "bar", bar: 42, baz: true });
new CacheManager<MyType<number, string, Array<number>>>().cache("bar", { foo: 42, bar: "bar", baz: [0, 1] })

I know it looks a bit odd to pass {} wherever the default parameter types are wanted. However, this was resolved in Typescript 2.3, which lets us provide default type arguments so that passing {} for the type parameters becomes optional.
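A sketch of what that could look like, redefining MyType with default type arguments (Typescript 2.3+) in place of the {} workaround:

```typescript
// MyType with default type arguments: callers may omit the parameters
// entirely and fall back to the defaults.
interface MyType<FooType = string, BarType = number, BazType = boolean> {
  foo: FooType;
  bar: BarType;
  baz: BazType;
}

const defaults: MyType = { foo: 'bar', bar: 42, baz: true };
const custom: MyType<number, string, number[]> = { foo: 42, bar: 'bar', baz: [0, 1] };
console.log(defaults.bar + custom.foo); // 84
```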

Generic Constraints

When we implicitly or explicitly use a type for anything, such as the key parameter of the cache method above, we are purposefully constraining the type of key to string. But we are inadvertently allowing any string whatsoever. What if we want to constrain the type of key to be one of the interface properties of MyType only? One naive way is to use string literal types for key below, and in return get the [ts] Argument of type '"qux"' is not assignable to parameter of type '"foo" | "bar" | "baz"' error when the key qux does not belong to MyType. This works correctly; however, because of the hard-coded string literal types, we cannot swap MyType for the YourType interface, since it has different properties, as follows.

interface YourType {
  qux: boolean;
}
class CacheManager<ValueType> {
  private cacheList: { [key: string]: ValueType } = {};
  cache(key: "foo" | "bar" | "baz", value: ValueType): ValueType {
    this.cacheList[key] = value;
    return value;
  }
}
new CacheManager<MyType<{},{},{}>>().cache("foo", { foo: "bar", bar: 42, baz: true }); // works
new CacheManager<MyType<{},{},{}>>().cache("qux", { foo: "bar", bar: 42, baz: true }); // !works
new CacheManager<YourType>().cache("qux", { qux: true }); // !works but should have worked

To make it work, we would have to manually update the string literal types used for key in the class implementation every time. However, similar to classes, Typescript also allows us to extend generic types, so we can replace the string literal types "foo" | "bar" | "baz" of key with a generic constraint, i.e. KeyType extends keyof ValueType, as follows. Here, we are forcing the type-checker to validate that the type of key is one of the properties of the interface provided to CacheManager. This way we can even serve the previously mentioned YourType without any error.

class CacheManager<ValueType> {
  private cacheList: { [key: string]: ValueType } = {};
  cache<KeyType extends keyof ValueType>(key: KeyType, value: ValueType): ValueType {
    this.cacheList[key] = value;
    return value;
  }
}
new CacheManager<MyType<{},{},{}>>().cache("foo", { foo: "bar", bar: 42, baz: true }); // works
new CacheManager<MyType<{},{},{}>>().cache("qux", { foo: "bar", bar: 42, baz: true }); // !works
new CacheManager<YourType>().cache("qux", { qux: true }); // works

Alright, that brings us to the climax of Generics. I hope \0/ Generocks  \m/ for you.


Fantastic Beasts: The Crimes of Pythonworld


When I decided to make my foray into the Pythonic world, I stumbled upon the sorcery between the system-level Python 2.7 and Python 3. What are pip and pip3? I used to install some Python packages using pip and, if that did not work, used pip3. One of the two always worked 🙅‍♂️. I did not know what I was doing apart from just getting the system up and running, until I determined to see how deep the rabbit hole goes. But the rabbit hole was not that deep; it was my confused mind that made it deep until now…

pip vs pip3

As you may have already guessed, Python 3 is the successor of Python 2. In order not to break the package manager pip for Python 2, Python 3 came with its own package manager under the name pip3. However, we can point the python and pip commands directly to the python3 and pip3 executables respectively (which we will see in later sections), so that we do not have to deal with the python3 or pip3 commands while running a Python script or installing any Python package.

The upshot is that pip by default points to the system level Python 2.7 and pip3 points to whatever version we have for Python3.

⇒  pip --version
pip 19.0 from (python 2.7)
⇒  pip3 --version
pip 18.1 from (python 3.7)

To naively alias the python and pip commands to python3 and pip3, we can add the following to our bash/zsh profile and reload the shell for it to take effect.

alias python=python3
alias pip=pip3
# BEFORE
⇒  python --version
Python 2.7.15
⇒  pip --version
pip 19.0 from (python 2.7)

# AFTER
source ~/.zshrc
⇒  python --version
Python 3.7.1
⇒  pip --version
pip 18.1 from (python 3.7)

This approach works; however, we constantly have to edit the bash/zsh file to switch between two or more versions of Python. Clearly, we can do better.

Introducing pyenv

Pyenv allows us to install and switch between multiple versions of Python.

pyenv versions

We can check which versions of Python are installed on our system with the following command. The * at the beginning marks the current Python version (the system Python 2.7 in this case) that the python and pip commands point to.

⇒  pyenv versions
* system
  3.6.3

pyenv install <version>

We can install any version of Python using the following install command. Note that the installation does not switch to the installed python version implicitly.

⇒  pyenv install 3.7.2
⇒  pyenv versions
* system
  3.6.3
  3.7.2

pyenv shell <version>

To manually switch to any Python version (in the current shell only), we can use this command. That means killing the shell window restores the Python version to the system-level one. Here we have switched to Python 3.7.2 in the current shell.

⇒  pyenv shell 3.7.2
⇒  pyenv versions
  system
  3.6.3
* 3.7.2

Introducing pyenv-virtualenv

Now we have fixed the problem of maintaining different versions of Python across various Python projects. But a different yet somewhat similar problem persists for Python packages too.

For example, imagine we have two Python projects running on top of Python 3.7.2 but using different versions of Django, 2.1.5 (latest) and 1.9. So installing both one after the other using pip install Django==2.1.5 and pip install Django==1.9 commands would override the 2.1.5 version with the 1.9 one. Hence, both projects inadvertently would end up using the same Django version which we do not want. That’s where Python Virtual Environments help.

There are many Python packages out there to manage virtual environments, some of them being virtualenv, virtualenvwrapper, etc., each better or worse than the others in some way. However, we are going to use pyenv-virtualenv, a pyenv plugin that uses virtualenv under the hood.

pyenv virtualenvs

Similar to pyenv versions, this command shows us a list of virtual environments we have on our system. Below I have one virtualenv venv already created for Python 3.6.3.

⇒  pyenv virtualenvs
  3.6.3/envs/venv
  venv

pyenv virtualenv <environment-name>

Let’s create a virtual environment for Python 3.7.2. Now we can see the two virtual environments created but none of them are activated yet.

⇒  pyenv virtualenv venv-3.7.2
⇒  pyenv virtualenvs
  3.6.3/envs/venv
  3.7.2/envs/venv-3.7.2
  venv 
  venv-3.7.2 

pyenv activate <environment-name>

Let’s activate the virtual environment venv-3.7.2. The * in the beginning represents the activated virtual environment where Django will be installed.

⇒  pyenv activate venv-3.7.2
⇒  pyenv virtualenvs
  3.6.3/envs/venv 
  3.7.2/envs/venv-3.7.2 
  venv 
* venv-3.7.2 

First, we can confirm if Django is installed in the activated virtual environment. If not, we will install Django 1.9.

# BEFORE
⇒  pip list --format=columns
Package    Version
---------- -------
pip        19.0.1
setuptools 28.8.0

# AFTER
⇒  pip install Django==1.9
⇒  pip list --format=columns
Package    Version
---------- -------
Django     1.9
pip        19.0.1
setuptools 28.8.0

So far so good. Now we must verify whether we got the isolation for packages using pyenv-virtualenv that we wanted.

pyenv deactivate

To check that, we can deactivate the current virtual environment. This command restores Python to the system-level one, and pip list will now show all the global Python packages installed on our system. Notice that Django is no longer listed, since we got out of the venv-3.7.2 virtual environment.

⇒  pyenv deactivate
⇒  pyenv virtualenvs
  3.6.3/envs/venv 
  3.7.2/envs/venv-3.7.2 
  venv 
  venv-3.7.2 
⇒  pip list --format=columns
Package    Version
---------- -------
pip        9.0.1
setuptools 28.8.0
airbrake               2.1.0
aniso8601              4.1.0
arrow                  0.10.0
asn1crypto             0.24.0
attrs                  18.2.0
bcrypt                 3.1.6
bitarray               0.8.3
boto                   2.49.0
boto3                  1.9.83
.
.
.

Wrap up

As of now, pyenv and pyenv-virtualenv are serving me well. I hope that things will be stable going forward too. 🤟


Managing CRON with Ansible


Setting up a CRON job manually is child's play, so why am I writing about it? For two main reasons:

  1. My experience of setting it up with Ansible
  2. Common mistakes I made which others can avoid

CRON

The CRON daemon is a long-running process that executes commands at specific dates and times, which makes it easy to schedule activities like sending bulk emails. CRON relies on a crontab file that holds the list of scheduled commands to run. We can manually add/edit/remove scheduled commands directly in the crontab file, but this may introduce bugs, especially when the list grows long. Ansible helps deploy such CRON jobs effortlessly.

Ansible

Ansible is an IT automation and orchestration engine. It uses YAML file syntax for us to write such automation called Plays and the file itself is referred to as a Playbook. Playbooks contain Plays. Plays contain Tasks. And Tasks run pre-installed modules sequentially and trigger optional handlers.

Accordingly, I have used the CRON module in Ansible to set up a task that configures a CRON job as follows:

- name: Run CRON job every day at 11.05 PM UTC
  become: yes
  become_method: sudo
  cron:
    name: "apache_restart"
    user: "root"
    hour: 23
    minute: 5
    job: "/bin/sh /usr/sbin/apache_restart.sh"
    state: present

Imagine there is a fictional CRON job to restart Apache2 at the specified time every day (god knows why). But I made my fair share of mistakes while setting it up initially. Let us go step by step through each of those mistakes:

become and become_method

These flags are only necessary when running the job with sudo or any other privilege-escalation method. In this case, I wanted to run the /bin/sh /usr/sbin/apache_restart.sh command with sudo, without the password prompt we usually get while running such commands manually. The become flag prevents the password prompt.

In the beginning, I had forgotten to add these flags, preventing the CRON job from executing the apache_restart.sh bash file as expected.

cron module

Ansible ships with a cron module that makes it far easier to set up CRON jobs. By mistake, though, I had initially placed the cron module directly in the Play instead of wrapping it in a Task, as shown below.

- cron:
    name: "apache_restart"
    user: "root"
    hour: 23
    minute: 5
    job: "/bin/sh /usr/sbin/apache_restart.sh"
    state: present

As we learned before, only Tasks can run pre-installed modules, so Ansible instantly threw an error during deployment and I managed to save face 🤦‍♂️

cron name

I thought that since I had already named the task, naming the CRON job itself would not be necessary. But I was more embarrassed than wrong: each time you deploy changes via Ansible without a CRON name, it sets up a new CRON job and leaves the previous one as is. So I was literally restarting Apache2 thrice at a time. Remember, the CRON name works as a unique key to check whether a CRON job with the same name is already set up. If not, Ansible adds a brand new entry to the crontab file; otherwise, it overrides the existing one with the new configuration.

state

The default state of a CRON job is present. To remove a particular CRON job, you change its state to absent and redeploy via Ansible. I was using the state present without a CRON name, which created multiple crontab entries on each deployment.
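As a sketch, removing the job above means redeploying the same task with the state flipped to absent; the matching name is what lets Ansible find and delete the existing entry:

```yaml
- name: Remove the apache_restart CRON job
  become: yes
  become_method: sudo
  cron:
    name: "apache_restart"
    user: "root"
    state: absent
```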

job

The job key takes the actual command that you want to run at the specified time/date. Make sure to use absolute paths in the command, since CRON runs with a minimal PATH environment.

Wrap up

I also use the tail -f /var/log/syslog and grep CRON /var/log/syslog commands to check the logs and make sure that CRON actually runs the script I specified.

If you found this article useful in any way, feel free to donate and receive my dilettante painting as a token of appreciation.

Kung Fu “Pandas”


I have planned to write an article per week on Data Engineering from the perspective of a beginner – the first one was Python for Frontend Engineers. Do not expect a proven pathway to becoming a Data Engineer, because I do not have any strategy at the moment; I am just following my gut feeling to become proficient sooner rather than later. So consider these blog posts as my personal notes, which may or may not be helpful to others.

So, Pandas is a data processing tool that helps with data analysis – meaning it provides various functions/methods to manipulate large datasets efficiently.

I am still learning Pandas and will continue to explore its features. The Pandas documentation is pretty self-explanatory, so I will just give a glimpse of its powers in this article, like a movie trailer.

Importing Pandas

This is how you can import Pandas and start using it right away.

import pandas as pd

pd.* # where * denotes all the supported methods

Pandas has two primary data structures: Series and DataFrame.

Series

The Series data structure represents a one-dimensional labeled array – conceptually similar to a Python dictionary mapping labels to values. The values can be of any type supported by Python.

Creating Series

sons_of_pandu = {
  'son1': 'Yudhishthira',
  'son2': 'Bhima',
  'son3': 'Arjuna',
  'son4': 'Nakula',
  'son5': 'Sahadeva'
}
pandavas_series = pd.Series(sons_of_pandu)
print(pandavas_series)
# Prints following in Jupyter Notebook
# son1   Yudhishthira
# son2   Bhima
# son3   Arjuna
# son4   Nakula
# son5   Sahadeva
# dtype: object

Changing Indices of Series

Sometimes we prefer to change the indexing for readability. Here we can change the index of the series to the Pandavas’ progenitors.

pandavas_series.index = ["Yama", "Vayu", "Indra", "Ashwini Kumara Nasatya", "Ashwini Kumara Darsa"]
print(pandavas_series) # Prints following in Jupyter Notebook
# Yama                   Yudhishthira
# Vayu                   Bhima
# Indra                  Arjuna
# Ashwini Kumara Nasatya Nakula
# Ashwini Kumara Darsa   Sahadeva
# dtype: object

Slicing Series

Slicing is really handy when glancing at a large dataset. We can also slice the series for an exploratory view as follows.

pandavas_series[0:2] # Prints first and second rows excluding the third
pandavas_series[1:]  # Prints all rows except the first one
pandavas_series[-2:] # Prints the last two rows only
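One more slicing trick worth knowing: slicing a Series by labels is end-inclusive, unlike integer slicing. A minimal sketch, building the same series with the progenitor index:

```python
import pandas as pd

pandavas_series = pd.Series(
    ['Yudhishthira', 'Bhima', 'Arjuna', 'Nakula', 'Sahadeva'],
    index=["Yama", "Vayu", "Indra", "Ashwini Kumara Nasatya", "Ashwini Kumara Darsa"],
)

# Label-based slicing INCLUDES the end label, unlike integer slicing
subset = pandavas_series["Yama":"Indra"]
print(list(subset))  # ['Yudhishthira', 'Bhima', 'Arjuna']
```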

Appending Series

It is very common to deal with multiple datasets in Pandas, and the append method is a complement you cannot ignore (note that Series.append was removed in Pandas 2.0 in favor of pd.concat).

kauravas = ["Duryodhan", "Dushasana", "Vikarna", "Yuyutsu", "Jalsandh", "Sam", "Sudushil", "Bheembal", "Subahu", "Sahishnu", "Yekkundi", "Durdhar", "Durmukh", "Bindoo", "Krup", "Chitra", "Durmad", "Dushchar", "Sattva", "Chitraksha", "Urnanabhi", "Chitrabahoo", "Sulochan", "Sushabh", "Chitravarma", "Asasen", "Mahabahu", "Samdukkha", "Mochan", "Sumami", "Vibasu", "Vikar", "Chitrasharasan", "Pramah", "Somvar", "Man", "Satyasandh", "Vivas", "Upchitra", "Chitrakuntal", "Bheembahu", "Sund", "Valaki", "Upyoddha", "Balavardha", "Durvighna", "Bheemkarmi", "Upanand", "Anasindhu", "Somkirti", "Kudpad", "Ashtabahu", "Ghor", "Roudrakarma", "Veerbahoo", "Kananaa", "Kudasi", "Deerghbahu", "Adityaketoo", "Pratham", "Prayaami", "Veeryanad", "Deerghtaal", "Vikatbahoo", "Drudhrath", "Durmashan", "Ugrashrava", "Ugra", "Amay", "Kudbheree", "Bheemrathee", "Avataap", "Nandak", "Upanandak", "Chalsandhi", "Broohak", "Suvaat", "Nagdit", "Vind", "Anuvind", "Arajeev", "Budhkshetra", "Droodhhasta", "Ugraheet", "Kavachee", "Kathkoond", "Aniket", "Kundi", "Durodhar", "Shathasta", "Shubhkarma", "Saprapta", "Dupranit", "Bahudhami", "Yuyutsoo", "Dhanurdhar", "Senanee", "Veer", "Pramathee", "Droodhsandhee", "Dushala"]
kauravas_series = pd.Series(kauravas)
pandavas_series.append(kauravas_series) # Prints following in Jupyter Notebook
# Yama                   Yudhishthira
# Vayu                   Bhima
# Indra                  Arjuna
# Ashwini Kumara Nasatya Nakula
# Ashwini Kumara Darsa   Sahadeva
# 0                      Duryodhan
# 1                      Dushasana
.
.
.
# Length: 106, dtype: object
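On Pandas 2.0 and newer, where Series.append no longer exists, pd.concat does the same job. A minimal sketch with shortened series:

```python
import pandas as pd

pandavas_series = pd.Series({'son1': 'Yudhishthira', 'son2': 'Bhima'})
kauravas_series = pd.Series(['Duryodhan', 'Dushasana'])

# pd.concat stitches the two Series together, preserving both index sets
combined = pd.concat([pandavas_series, kauravas_series])
print(len(combined))  # 4
```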

Dropping from Series

Pass the index label to drop a row from the series. Note that drop returns a new series and leaves the original untouched.

pandavas_series.drop('Yama') # Prints following in Jupyter Notebook
# Vayu                   Bhima
# Indra                  Arjuna
# Ashwini Kumara Nasatya Nakula
# Ashwini Kumara Darsa   Sahadeva
# 0                      Duryodhan
# 1                      Dushasana
.
.
.
# Length: 105, dtype: object

Dataframes

The DataFrame data structure represents a two-dimensional labeled table – conceptually, a list of Python dictionaries where each dictionary is a row.

Creating Dataframe

sons_of_pandu = [{
  'name': 'Yudhishthira',
  'progenitor': "Yama"
}, {
  'name': 'Bhima',
  'progenitor': "Vayu"
}, {
  'name': 'Arjuna',
  'progenitor': "Indra"
}, {
  'name': 'Nakula',
  'progenitor': "Ashwini Kumara Nasatya"
}, {
  'name': 'Sahadeva',
  'progenitor': "Ashwini Kumara Darsa"
}]
df_pandavas = pd.DataFrame(sons_of_pandu)

Head’ing DataFrame

df_pandavas.head()  # returns first 5 rows
df_pandavas.head(3) # returns first 3 rows

Tail’ing DataFrame

df_pandavas.tail()  # returns last 5 rows
df_pandavas.tail(3) # returns last 3 rows

Sorting DataFrame

df_pandavas.sort_values(by="name")

Slicing DataFrame

df_pandavas[0:2] # Prints first and second rows excluding the third
df_pandavas[1:]  # Prints all rows except the first one
df_pandavas[-2:] # Prints the last two rows only
df_pandavas[["name"]] # Prints all rows with "name" column only
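Beyond plain bracket slicing, DataFrames also support explicit position-based (.iloc) and label-based (.loc) indexing. A minimal sketch with a shortened df_pandavas:

```python
import pandas as pd

df_pandavas = pd.DataFrame([
    {'name': 'Yudhishthira', 'progenitor': 'Yama'},
    {'name': 'Bhima', 'progenitor': 'Vayu'},
    {'name': 'Arjuna', 'progenitor': 'Indra'},
])

# .iloc slices by integer position (end-exclusive, like Python lists)
first_two = df_pandavas.iloc[0:2]

# .loc slices by index label (end-INCLUSIVE) and can pick columns too
names = df_pandavas.loc[0:1, 'name']

print(len(first_two), list(names))  # 2 ['Yudhishthira', 'Bhima']
```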

Copying DataFrame

df_pandavas_in_alternate_dimension = df_pandavas.copy()

Wrap up

That’s it. There is more to Pandas than mere slicing/merging/copying/sorting. You can easily read/write CSV/Excel files in Pandas like never before. Head over to the Pandas documentation for more information.
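For instance, a CSV round trip is a one-liner each way. A minimal sketch (the file name pandavas.csv is arbitrary):

```python
import pandas as pd

df = pd.DataFrame([{'name': 'Arjuna', 'progenitor': 'Indra'}])

# Write without the index column, then read it back
df.to_csv('pandavas.csv', index=False)
df_back = pd.read_csv('pandavas.csv')

print(df_back.equals(df))  # True
```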

If you found this article useful in any way, feel free to donate and receive my dilettante painting as a token of appreciation.

Python for Frontend Engineers


Py thee on.

— William Shakespeare (meaning Python code turns you on) 😅

I tried to teach myself Python for a week and now I’m able to read and understand Python code comfortably. This blog summarises my thought process while learning Python, comparing its syntax with Javascript at every step. I’ll keep updating it as and when I encounter new Pythonic stuff.

First and foremost, wear a gauntlet and snap your fingers for semicolons, curly braces, and camelCase names to bite the dust.

console.log vs print

Javascript | Python
console.log("Hello World!"); | print("Hello World!")
console.log("Hello " + "World!"); | print("Hello " + "World!")
var greet = "Hello"; | greet = "Hello"
var who = "World!"; | who = "World!"
console.log(`${greet} ${who}`); | print("{0} {1}".format(greet, who))
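On Python 3.6+, f-strings are an even closer analogue to Javascript template literals than str.format. A minimal sketch:

```python
greet = "Hello"
who = "World!"

# f-strings interpolate variables inline, like `${greet} ${who}` in JS
message = f"{greet} {who}"
print(message)  # Hello World!
```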

Primitive Types

Javascript | Python
var name = "John Doe"; | name = "John Doe"
var age = 100; | age = 100
var isMale = true; | is_male = True

Utility Methods

Javascript | Python
"amit".toUpperCase(); | "amit".upper() # also: "amit".upper().isupper()
"amit".length; | len("amit")
"amit".indexOf("a"); | "amit".index("a") # throws ValueError if not found 😦
String(100); | str(100)
parseInt(100.300, 10); | int(100.300)
parseFloat(100.300); | float(100.300)
Math.abs(-100); | abs(-100)
Math.pow(3, 2); | pow(3, 2)
Math.round(3.2); | round(3.2)
Math.floor(3.2); | math.floor(3.2) # needs `import math`
Math.ceil(3.2); | math.ceil(3.2)
Math.sqrt(25); | math.sqrt(25)
Math.max(4, 2); | max(4, 2)
Math.min(4, 2); | min(4, 2)

User Input

Javascript | Python
var ans = prompt("What's up?"); | ans = input("What's up?")

Arrays vs Lists

Javascript | Python
var arrStr = ["a", "b", "c"]; | lst_str = ["a", "b", "c"]
arrStr[0]; | lst_str[0]
arrStr[-1]; // returns undefined | lst_str[-1] # returns "c"
arrStr.slice(1); | lst_str[1:] # returns ["b", "c"]
arrStr.slice(1, 2); | lst_str[1:2] # returns ["b"], excludes the 3rd
arrStr.slice(0, 2); | lst_str[:2] # returns ["a", "b"], excludes the 3rd
arrStr[1] = "d"; | lst_str[1] = "d"
arrStr.push("d"); | lst_str.append("d")
arrStr.splice(1, 0, "f"); | lst_str.insert(1, "f")
arrStr.indexOf("b"); | lst_str.index("b")
NA | lst_str.count("d") # returns 2 occurrences
arrStr.sort(); | lst_str.sort()
arrStr.reverse(); | lst_str.reverse()
var arrNum = [1, 2, 3]; | lst_num = [1, 2, 3]
arrStr = arrStr.concat(arrNum); | lst_str.extend(lst_num)
var arrStr2 = arrStr.slice(); | lst_str2 = lst_str.copy()

Tuple in Typescript vs Tuple in Python

Javascript | Python
var point = ["x", "y"]; | point = ("x", "y")
point[1]; | point[1]
var arrTuple = [["x", "y"], ["w", "z"]]; | list_tuple = [("x", "y"), ("w", "z")]

function vs def

// Javascript
function greetMe(name){
    console.log("Hi! " + name);
}

# Python
def greet_me(name):
    print("Hi! " + name)

// Javascript
function greetMe(name){
    return "Hi! " + name;
}

# Python
def greet_me(name):
    return "Hi! " + name
If Statements

// Javascript
if ((1 && 1) || (2 === 2)) {
  console.log("124");
} else if (1 !== 2) {
  console.log("421");
} else {
  console.log("Lakhan!");
}

# Python
if (1 and 1) or (2 == 2):
  print("124")
elif 1 != 2:
  print("421")
else:
  print("Lakhan!")

Objects vs Dictionaries

// Javascript
var objRomans = {
  "five": "V",
  "ten": "X"
};

// Returns undefined if the key does not exist
objRomans["five"];

# Python
dict_romans = {
  "five": "V",
  "ten": "X"
}

# Throws a KeyError if the key does not exist
dict_romans["five"]
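If you want the Javascript-like behaviour of a harmless result for a missing key, Python’s dict.get returns None (or a default you supply) instead of raising. A minimal sketch:

```python
dict_romans = {"five": "V", "ten": "X"}

# .get returns None (or a supplied default) for a missing key,
# instead of raising KeyError like dict_romans["fifty"] would
print(dict_romans.get("fifty"))       # None
print(dict_romans.get("fifty", "L"))  # L
```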

Loops

// Javascript
var i = 0;
var crossedLimit = false;
while (i !== 10) {
  i += 1;
  crossedLimit = i === 10;
  if (!crossedLimit) {
    console.log(i);
  }
}

# Python
i = 0
crossed_limit = False
while i != 10:
  i += 1
  crossed_limit = i == 10
  if not crossed_limit:
    print(i)

// Javascript
var name = "Amit";
for (var i = 0; i < name.length; i++) {
  console.log(name.charAt(i));
}

# Python
name = "Amit"
for char in name:
  print(char)

// Javascript
var names = ["A", "M"];
for (let name of names) {
  console.log(name);
}

# Python
names = ["A", "M"]
for name in names:
  print(name)
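And when you do need the index alongside the value in a Python loop, enumerate is the idiomatic counterpart of index-based Javascript loops. A minimal sketch:

```python
names = ["A", "M"]

# enumerate yields (index, value) pairs, akin to iterating names.entries() in JS
for i, name in enumerate(names):
  print(i, name)
```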
Comments

// Javascript
// single line comment
/*
  multi-line comment
  can be very long
*/

# Python
# single line comment
"""
  multi-line comment (technically a string literal)
  can be very long
"""

Imports

// Javascript
// imports everything from cookies.js
import "./cookies";

// imports `a` from cookies.js
import a from "./cookies";

# Python
# imports everything from cookies.py
import cookies

# imports `a` from cookies.py
from cookies import a

Try Catch vs Try Except

// Javascript
try {
  let obj = JSON.parse("");
} catch (e) {
  console.log(e.message);
}

# Python
import json

try:
  dict_obj = json.loads("")  # avoid naming it `dict`, which shadows the built-in
except Exception as e:
  print(str(e))

Classes

// Javascript
class Bike {
  isBikeToo() {
    return true;
  }
}

class Car extends Bike {
  constructor(tire) {
    super();
    this.tire = tire;
  }

  whoAmI() {
    if (this.tire == 3) {
      return "Auto";
    } else {
      return "Bike";
    }
  }
}

var c = new Car(3);
console.log(c.tire, c.whoAmI(), c.isBikeToo());

# Python
class Bike:
  def is_bike_too(self):
    return True

class Car(Bike):
  def __init__(self, tire):
    self.tire = tire

  def who_am_i(self):
    if self.tire == 3:
      return "Auto"
    else:
      return "Bike"

c = Car(3)
print("{0} {1} {2}".format(
  str(c.tire),
  c.who_am_i(),
  str(c.is_bike_too())
))

If you found this article useful in any way, feel free to donate and receive my dilettante painting as a token of appreciation.

Frontend Frontier – sci-fi short story


This is my 2nd science-fiction short story (the 1st one is here). It is a tribute to Frontend Engineers and how their lives and jobs might change in the future due to the advent of ML/AI.

Special thanks to Aamna, Roshan, and Abhishek for grammatical/punctuation improvements and the overall constructive feedback.

Please star or comment to provide your valuable feedback, whatever it may be. I’m immune to anything hurled at me 😅.

IN A VACANT BUT CHAOTIC CABIN, a soul, deep in reverie, entered. The door plate had a caption that read, “Yesterday’s gadgets were in front of you, Today’s gadgets are in your hands, and Tomorrow’s gadgets will be right within you”. That was the most influential gift his grandpa had given him at the age when he knew the only use of a tongue in his mouth was to tease others. Now, he was in his late twenties and had worked part-time as an A.I. specialist for many years. Dare not gaze in awe, since everybody was an A.I. specialist these days, mostly due to high demand and high wages in return compared to other jobs or, sarcastically, no jobs. He recalled the “There is an APP for that!” story often shared by his grandpa to justify the current situation. Because of the rampant use of Machine Learning tools and techniques, making something smarter was a piece of cake now. Of course, you have to dig deeper to improve the existing Machine Learning models or invent your own for an unprecedented result.
Dave was an autodidact who inhabited his working space. The room was furnished with the aesthetically minimalistic design which gave the impression of everything fairy. One side of the room, the door facing wall was painted pinkish blue, only to warp it nicely or rather gel with other drabbed walls. Silvery photo frame, containing meaningless photos in baker’s dozen, was planted on one of the drabbed walls in a shape of Kalpataru tree. It was connected to the Line (was called Internet before) as pictures that were framed-in changed randomly or perhaps as per the mood and emotions emanated by the people around. It turned out that even the walls changed colors and patterns by injunction.
There was no source of open vents, but still, it was quite frigid in there for anybody to retreat on chaise longue—cemented far away at the corner to rest for eternity.
Another wall, facing east had big nestled sliding windows for sunlight and thereby for warmth to slip through. As usual, the Sun had shone on the sliding windows from outside like a stage lighting. Looking down with half-shut eyes through the glass confirmed that the cabin was a part of 100-story construction erected from quite above the ground. Might be lodged on top of some flat structure below but could not be seen from up above. Even the surroundings were occupied with similar but sufficiently spaced out constructions.
Adjacent to the sliding windows, the stacked circular knobs controlled the artificial light—sunlight to be precise supposing the sun was not around—emerging from the glass to populate the room since no working hours were pre-defined.

Most of the things were wall mounted but the only exception was Dave’s workstation. Long gone were days when people in the past profoundly designated themselves as software engineers. In those days, their work and identity extended beyond mere programming software or so-called coding: analyzing project requirements, brainstorming with a team, creating prototypes, beta testing with handful of early adopters, fixing bugs, churning out new features, and many more such activities had made engineering the nail-biter job in a good way. But things had changed drastically and with unexpected swiftness since then.
Now they were called experts.
When people stopped doing traditional searches on various search engines back then, the obvious progression for many of those search engines was to become Digital Personal Assistants. Many could not fathom the idea and some died due to competition and so to speak complexity in giving accurate answers. Today, they collaborate or ironically fraternize themselves with Intelligent Digital Personal Assistant (IDPA) for any software related work as if humans and machines had switched roles. Even the schism between frontend and backend had become illusory.
Nowadays these experts had been trusting IDPAs more than their instincts when it came to problem-solving. By the same token, this probably might have made the software engineer term obsolete in this era.

When Dave entered the room, IDPA wished him welcome, “Good Morning, Dave!” and reminded in a soothing female voice, “I’m stoked to work alongside you on a new project today.”, followed by a giggle. Then it auto played his favorite song when Dave said nothing in return, deadpan. Dave was mulling over a problem he had been carrying in his head since last night. Yeah, that was one of the things he was allowed to do until A.I. figured out to do of its own; speculating the exponential growth in technology, it will happen soon.
While the song reached its peak, Dave slingshotted himself off to a fridge, fetched various flavors of staple meal as much as he could clasp, and hurriedly trotted near his desk before he unintentionally dribbled everything on the matted floor. In the meantime, his workstation lit up automatically when it sensed him near. IDPA had recognized Dave, booted his monitor instantly or it just reappeared magically as if someone tossed it in the realm of the 3D from the unknown 4D world, while he was busy tearing off one of the staple meals he just lugged.
After some time when his mouth was full enough to chew some of it, he transferred the bowl in one hand and threw the remaining meal bags on the other side of the desk. Then buried himself in the ergonomic chair, chewing vigorously raised one hand with the palm facing the drabbed wall, like a rocket flies off the surface, only to stop the music.
He turned his face squarely towards the computer screen, affixed to the table. Then stared at it for long enough, making the surroundings blurry after a while, almost nonexistent as if he was floating on a cloud.
He was alone. Working From Home Forever.

 

AS USUAL, HE STUFFED the remaining meal into his mouth and set the bowl away, literally slid it far across the desk to never grab it again. Dave began his day by skimming over unread messages first, in a nick of time, as he had already linked IDPA to a universal messaging gateway.
Back in the old days, when the poorest of the poor were given the affordable internet, it became obvious that accumulating such a huge data by just one company would have been hazardous. Thence the market leaders from various countries back then were the first to realize that something needs to be done to avoid the draconian future. So these early liberators decided to come to terms with it and had discerned of creating the universal platform to amalgamate the disintegrated tribes. Eventually, their efforts gave birth to the universal messaging gateway. It had been developing as the open project since then, mainly consisted of a mixture of open APIs and a distributed file system at its helm. This meant, no one company would be held accountable for owning the public data, not even the government by any means.
With such an architecture at its core, even the smallest entrepreneur delivering weed at home could use the messaging gateway to notify its customers, for anything. The weed delivery Page in ARG, however, has to provide hooks for the universal messaging gateway to pull in real-time updates and notify those customers in a scalable manner. Later similar strategy was used with IDPA so that any kind of requests could be made without even installing the Page. Just give a command to your IDPA and boom!

The Universal Messaging Gateway now happened to be the common but much powerful interface to access all sorts of messages coming from family members, friends, colleagues, ITS, IoT home appliances, and all sorts of fairy things mounted around the room of Dave. And from strangers too—spam detection was an unsolved problem. It would worsen though, some said.
IDPA automatically moved a few important incoming messages to the top, while the overwhelming majority of such messages were grouped separately for Dave’s consideration. Some experts preferred to check each message manually by disabling IDPA (its icon resembled a bot head placed atop a human abdomen) in order to vindicate their amour-propre, but for Dave the automation just saved time.
Predominantly, the incoming messages would be resolved instantly, in a moment as soon as they arrive, by IDPA itself only if they are project specific. But those replies were unerringly apt as if Dave himself were involved. Although, Dave could see such messages on the right side of the screen. He prodded on a flat sheet of a keyboard which was paired with ARG to close the list of auto-replied messages in order to shift his focus to the important ones in the center. Now went full screen with a gesture.
After a while, IDPA voiced sympathetically, “The injunction to be nice is used to deflect criticism and stifle the legitimate anger of dissent”.
It was one of the famous quotes fetched furtively over the wire when Dave was busy smashing keys. In this case, IDPA dictated him not be rude.

 

SPLAYED ACROSS THE BOTH EARDRUMS in a stream of steep hum was the reminder of an upcoming live code conference, happening at the luxurious resort. Dave supposed to be attending it but woefully caught up with urgency. Earlier he had watched such events remotely without any privation despite the fact that one had to be physically present at the venue to grab various sponsored tech goodies for free.
He welted on the notification to start a live stream that swiftly covered the preoccupied screen in ARG as if a black hole had swallowed a glittering star in oblivion.
Having himself competitively gazing at the live stream was not rendered on the real computer screen. The projection of the virtual computer screen, made via ARG (he was wearing), was of a shape of a glowing 3D rectangle. Having glowy, it was not fairy though. It looked almost real as if it was materialized there, and moreover, others could see it too if they were on the same Line. Further, he would stretch the screen to suit his needs. Sometimes he would transmogrify it into multiple screens for more arduous tasks. Inevitably, he would start the new project today, so a single screen. But wider.
Augmented Reality Goggle (ARG) was a small device of the shape of a cigarette, mounted behind both ears, which made this possible. It zip-tied to the top of his left and right auricles and connected by a thin wire from behind, someday to talk to the amygdala to send data via neural signals to enable brain-to-brain communications. High definition cameras and mics were attached to both aft and rear ends of the device so that a 360 video feed could be viewed or captured.
By the same token, ARGs were of equal stature to human eyes.

Ruffling his hair with left hand, Dave reposed himself in the chair when chattering noise coming from the remote conference advanced. Still, plenty of time remained to start a keynote though.
ARG had been upgraded beyond what it was during its nascent stage. Now you would beam your reality around you for others to experience in real-time. Similarly, the conference sponsors had broadcasted the whole conference hall in a 360-degree video feed that anybody—and most importantly from anywhere would tap into. That way, Dave could see everyone who was attending the conference physically as if he was with them. Even the conference attendees use ARGs during the live conference instead of watching directly with their naked eyes; mostly for fancy bits that you would not see otherwise.
Strips of imagery scrolled in the line of sight when Dave observed the conference hall in a coltish manner for some familiar faces. With a mere gesture, you could hop person to person to face them as if you were trying to make an eye contact to begin a conversation with. The only difference here was that the person on the other side would not know of until you sent a hi-five request.
Dave stiffened for a moment and, with keen observation, looked for his best friend, who was physically attending the conference this time. “There he is!”, Dave shouted aimlessly. Before sending him a hi-five request, Dave flipped his camera feed by tapping on the computer screen to face himself, and drew a blooming rectangle on it (starting with a pinch in the middle of the screen, both fingers moving away from each other) that captured his face down to his torso like a newspaper cutout. That was, as a matter of fact, the only subset of his untidy reality that would be broadcast when conversing. He flipped the screen back to face his friend and sent him the hi-five. They retreated and discussed technology amid laughter and jokes until the conference began.
Subsequently, Dave shifted gears and colonized the chair and the escritoire in the room with his feet to watch the keynote for a few productive hours.

 

HOPPED OVER A CODE EDITOR when Dave reclined in the chair with the satisfied feeling after watching his favorite conference. Dave’s job was to find various ways to make IDPA and related AI machinery astute. On this day, however, he would spend his time building a Page a.k.a. ARG application.
The code editor opened all those files he left unclosed last night. Then it was overlapped with a small notification dialog about pending reviews. Dave resisted the urge to cancel the notification, for he knew that he would need complete focus for the new project starting today, to not leave his colleagues hanging in the air.
In the past, some people were worried about Software eating the world but lo and behold, software ate the Hardware too. Software as a Service was the norm back then but after few years some smart folks thought of Hardware as a Service and it was a game changer. The result of that, today, if you need a new machine to run any software, you just have to launch AHS Page (Augmented Hardware Services) in ARG and choose the likes of configurations you prefer in terms of RAM, Graphics card, HD display, Storage, and whatnot—Up and Running workstation in no time and far cheaper. After this setup, all you need was a high-speed Fiber Line which was pretty commonplace nowadays. ARG lets you connect to the Line which in turn allows you to interact with the workstation (and many things) in Augmented Reality. It is, in other words, the entire operating system and the likes of Pages you need at the moment, all run on the cloud and then projected in your field of view.
That way the Dave’s code editor was rendered too.

Dave engaged and finished with the review without further ado except at one place wherein according to Dave it needed a personal touch. So he wrote a polite explanation to prove his point with a mundane graph drawn using arrows (–>), dashes (—), dots (…), and attached the video recording along with it—meaning he just had an idea of making the program extensible and immune to future requirement (Still a dream). Then moved on to face the new instance of the code editor. Before he began thinking of the new project, IDPA prompted unconventionally in his ears, “Dave, I would like to inform you that you had forgotten to submit the review”. Dave quickly submitted it with embarrassment. With revengeful sense, Dave teased patiently to IDPA,

“What is life?”

“I know you are not interested in scientifically accurate answer”, rebutted IDPA in disguise after sensing Dave’s intention, “but to me (and you too), it’s a ToDo list!”.

Dave ought to take revenge, but instead defeated by the machine, he conceded his defeat and decided to divert his attention to the job at hand.

 

WHEN THE USE OF MACHINE LEARNING techniques soared, many programmers gave up on traditional UI/UX development and in fact focused on training the machines to do so. They long ago predicted, if it was achieved to the level of human intelligence, it would save time and money for many. Today, IDPA was not that profound when it comes to creatively lay out a design on an empty canvas on its own. Instead it relies a lot on existing designs and trends only to come up with a somehow similar but little bit different designs compared to the rest. Although, it’s not imaginatively creative at all but still, Dave was optimistic for having IDPA on his side today.
IDPA built into the code editor was given a simple command to fetch the designs for the new project from the cloud repository. The new project was about a newly launched Flying Commercial Vehicle which was as compact as a 4-seater electric car but flies up too without the need of big rotor systems. Dave was given the task of creating an ARG Page that must include various shots of the Vehicle from all angles, using which people around the world can make bookings from anywhere. When he fed those details to IDPA from within the code editor, it quickly brainstormed in nanoseconds and churned out a design which made Dave sad—not because it was bad but because it looked the same as hundreds of other Pages he had seen before. Most importantly, he did not like the placement of the snapshots and the details provided.
There was one more way although, especially, for those who still had some artistry left in them when it came to design Pages. He briskly drew boxes on a blank canvas in a circular form making up the circle of life (as if that’s the last thing people needed to complete them), filled some of them with random texts, and marked certain areas where the snapshots must be. Then fed his magnum opus to IDPA which produced the Page out of it instantly. On the same line, he drew few more things to capture the booking details and then have it Page‘d too. In the meanwhile, IDPA slapped some statistics on the screen, apart from some not-so-important maths, it showed file size of the compiled Pages vs the original design files. That made Dave sighed in satisfaction as if humans had contacted an Alien Race they could talk to.
Dave went through the created Pages just to read information about the vehicle, mentioned next to each angled shot. Looks like he was on fire today since ideas kept coming to him in order to make the current Page design even better. Now he could either go back to the drawing board or make edits in the existing Pages himself. He thought for a moment and decided to go with the latter option, that is, to open an Interface Builder. With a flick of a button it literally transformed the code editor UI into the interface builder UI, snapping a bunch of pallets on each side, only to assist him.
Dave focused on the current design to change it the way he intended and also fixed few design errors that the intelligent interface builder suggested, given the best practices and performance incentives. It was intelligent for a reason since it had added appropriate validations on the data fields automatically that supposed to capture the registration details of people who wanted to buy the vehicle. The only thing Dave had to do was connect the same to the cloud data storage which was the kid’s play. So he picked one of the few available cloud storage engines to save the data and pelted the finish button that in turn compiled the final ARG application that he then pushed live on the ARGStore which was the one-stop destination to host all sorts of ARG applications now.
This had become the reality now that browsers were long dead and the Web, if you may know, had been transfigured beyond recognition.

Dave saw that IDPA had been holding back some news for a long time so as not to disturb him. He unmuted it only to learn that some researchers had cracked how the brain thinks creatively, and he determined to let go of the thoughts he had been mulling over since the previous night.
If you liked this story in any way, feel free to donate and receive my dilettante painting as a token of appreciation for your donation.

Rendezvous – sci-fi short story


This is my first attempt to write a science-fiction short story (a couple more are still in progress 😙). I must acknowledge that I’m no Douglas Adams to hit a home run with the first story 😄. Nonetheless, it was exciting and tough as f***. My grand salute to all those science-fiction authors who have been writing countless pages with the integrity and soul of real science in their writing.

Please star or comment to provide your valuable feedback, whatever it may be. I’m immune to anything hurled at me 😅.

WE ARE ALONE IN THE UNIVERSE was the faith deeply rooted in humanity as planet Earth became a utopian world. A world without poverty, corruption, wars, and maladies. A world now run by gods called Humans.

He woke up to the silence for the final countdown. His body was numb and, furthermore, the ticking wall clock seemed deliberately obtuse. Tick, tock, tick, tock, tick, tock, tick, tock, tick, tock: ten seconds had passed. The clock on his right glowed blurrily, blinking continuously, as if someone who could not be seen with naked eyes were around him. But he could sense it, feel it, believe it. A few minutes passed, but all he could hear was the ticking clock. Suddenly, lights splashed on him from all angles. He was lying on a bed in a hospital. Yuvon is dying!

Not a death by old age or illness like ancient humans; he was going through euthanasia of his own will. He had lived a prosperous life for three straight centuries. Today is his last day!

His best friend, Fa, a renowned scientist who had worked with him on many revolutionary projects, sat next to him to hear his final words before Yuvon rested himself in peace. Yuvon’s last invention was a disgrace (according to him) that had forced him to end his life. But he was still believed to be influential, and his opinions were still taken seriously. Gathering his thoughts to search for the answers within, he blabbered confusedly, “Fa, have you ever thought of having some kind of life around us…?”

“…unseen to our blindfolded eyes?”, he asked, almost to himself.

According to Fa, he was exaggerating the things at hand, but that’s how he had been for eons: no one believed in him until he achieved the impossible and became the greatest of all. Yuvon, one of the brightest minds on the planet, was born a child prodigy. He had been the proponent of genetically improving human longevity and cognition to make humans real gods, rather than making humanoid robots. He believed imitating body functions is far easier than mimicking conscience, or consciousness to be exact. Later he led the massive massacre event in the late 21st century in which all poor, ailing, corrupt, and incompetent people were genetically massacred. The only motive behind such a gruesome act, the like of which had never been seen in ancient history, was to reboot humanity with a controlled population of only the smartest and brightest humans. With such appalling rules, within a few centuries the danger of existential crisis became a trouble of the past, and humans were enthroned as an interplanetary species.

Since childhood, his inquisitive mind had been occupied with the mysteries of our existence in the universe and restive to find answers. The theory of evolution, levied upon humanity for ages, was a little white lie to him, so he had long sought to find an extraterrestrial civilization to prove himself wrong. For that matter, it turned out to be his life’s only mission, or rather a fixation. With Yuvon’s sheer will, his team worked on the clandestine project for many decades. But consequently, his latest obsession to contact alien life in the vast and somewhat mysterious sea of stars had emerged as a complete fiasco.

While generations of effort had been exhausted to make humans immortal, Yuvon’s move to meet death – considered a disease nowadays – was unsettling for Fa’s befuddled brain. Before Fa could react, Yuvon continued, “…We may not be aware of their presence.”

After a long silence, hoping Yuvon had more to say, Fa phlegmatically refuted the possibility: “What made you come to that conclusion?”

Yuvon knew that he was insane, but not enough to let go, at least not when he had a mere ten minutes left. “We sent high-frequency radio signals, made mammoth machines to tap into cosmic changes, detected hundreds of Earth-like planets within reach, propelled human ships there to look for some life forms. Indeed, we left no stone unturned to find alien life in outer space. What have we found so far…” He choked, out of breath, then breathed deeply to confess further but decided to restrain himself.

Fa tried to console Yuvon in his franticness. “Nothing. But isn’t what you have achieved so far commendable? I have not seen you so desperate before. Where is your perseverance? If you choose to stay alive, who knows, we may find success in our research.”

Yuvon retorted, “It’s not about persistence. It’s about perception.” A spooky-looking flying robot with chimpanzee-like long hands ending in sprawling fingers entered the cubicle. It had no legs or torso. It literally flew over Yuvon’s lying body, and a digital prompt popped up after it scanned his face. While checking Yuvon’s background details on the screen, the bot’s silicon eyes widened as it acknowledged Yuvon as its creator. With its spidery fingers, it initiated euthanizing Yuvon. His breathing returned to normal as the doctor prepared the preliminary procedures.

He continued impatiently, “As you know, we humans cannot live in water like marine animals do. Would we ever have found marine life on earth at all if we had forgone searching the oceans for life just because our own characteristics fall flat there? The whole problem with our research is that we reckon other life-forms may not exist in an environment where we cannot survive. We have always looked for an alien planet that exhibits somewhat similar characteristics to our own planet, and that is the problem!”

Fa smiled and said evenly, “Certainly! But isn’t that the fastest way to find extraterrestrial life on the billions of habitable planets around us?”

Yuvon stopped to ponder but swiveled the debate back on track, then said firmly, “What sounds right is not always right. Ever wondered why there are so many different animals, insects, and whatnot on this planet? It’s amusing not to think there is a conspiracy here.”

Fa bestirred himself for the first time, “What do you mean?”

“Maybe all the life-forms, including us, that we see, touch, smell, and feel were artificially created by some sort of higher beings.”

“Why would they do that?”

Pointing his finger towards the doctor, he said thoughtfully, “Why have we created these bots? To help us, right? I highly doubt that they left us here alone, because no creator builds something that is not useful to him. We must be useful to them somehow, and presumably they must be here around us in a different form. Watching us, controlling us.”

“You mean, demigods??”, Fa questioned, reluctant to accept what Yuvon was trying to sell him.

“They may be an ancient civilization that advanced itself to be non-human, for better or worse,” rebutted Yuvon.

Fa desperately wanted to speed up the meaningful conversation, knowing how little time Yuvon had left. “Why would any civilization do that? What does it have to do with your death?”

Yuvon sensed his urgency and said confidently, “Once you invent a technology, there is no fun being a caveman!” Yuvon spoke rapidly to the perplexed Fa, sensing his brain and heart were about to shut down within a minute. “When you do a certain thing the same way, over and over again, you run on your muscle memory, which makes you boring, monotonous, and essentially the owner of whatever you are doing, because muscle memory was built on experiences you had in the past. But when you do the same thing differently each time, you run on your intuition, which makes you interesting, creative, and essentially a puppet. Because some higher energy suddenly comes into play to feed you that intuition, you are no longer in control. Our senses are too restrictive for us to see them. Or we are not meant to see them. For a shark to comprehend the vastness of the sea, it has to leave the box it is trapped in and come out, only then to perceive the grandeur of the seas.”

As Yuvon began to enter the eternal slumber, an elfin grin emerged on his face, as if he had solved the mystery. “Yes, Fa. What we cannot see does not deny its existence!”

Before Fa could react, Yuvon’s brain was about to halt forever. His tongue slipped and shuddered, faltering his words. He could not continue further as his heart stopped pumping permanently, but his brain was surprisingly spry. “Will Fa pursue what I devised? Will humanity heed my advice after my death? Will humans ever learn to see things differently? Or will my words be forgotten forever?” he thought to himself as his brain went black.

His eyes shimmered for the last time, and a cool breeze took over his entire body, making it cold rapidly, as if someone had touched him.

If you liked this story in any way, feel free to donate and receive my dilettante painting as a token of appreciation for your donation.