Enable tree-shaking in Rails/Webpacker: A Sequel


A month ago, I wrote a blog post explaining a hacky way to enable tree-shaking in a Rails/Webpacker project at Simpl. I would definitely recommend skimming through the previous post if you have not already.

In this post, we will jump directly into a more robust and stable solution. But before that, let me revisit the problem that haunted me for months: a broken manifest.json was generated at a random point during webpack compilation. This time, after upgrading @rails/webpacker and the related webpack plugins, the problem escalated beyond repair: an incomplete but valid manifest.json was generated randomly, having fewer pack entries than expected. Since the generated JSON was no longer malformed, even the hacky NodeJS fix_manifest.js script I had written last time to repair the broken JSON could not help.

After a bit of googling my way out, I learned that webpack, with multi-compiler configurations, compiles each webpack configuration asynchronously and in no particular order, which is why I was getting an invalid manifest.json earlier.
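For context, webpack treats an exported array of configurations as a multi-compiler build, and each configuration in the array is compiled independently with no guaranteed ordering. A minimal illustration (the entries here are hypothetical):

// webpack.config.js: an array of configurations triggers webpack's
// multi-compiler mode; each one compiles independently, in no fixed order
module.exports = [
  { mode: 'production', entry: { a: './app/javascript/packs/a.js' } },
  { mode: 'production', entry: { b: './app/javascript/packs/b.js' } }
]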

Imagine two webpack compilations running simultaneously and writing to the same manifest.json at the same time:

{
  "b.js": "/packs/b-b8a5b1d3c0c842052d48.js",
  "b.js.map": "/packs/b-b8a5b1d3c0c842052d48.js.map"
}  "a.js": "/packs/a-a3ea1bc1eb2b3544520a.js",
  "a.js.map": "/packs/a-a3ea1bc1eb2b3544520a.js.map"
}

Using a different manifest file for each pack

Yes, this is the robust and stable solution I came up with. First, you have to override the Manifest fileName in every webpack configuration in order to generate a separate manifest file for each pack, such as manifest-0.json, manifest-1.json, and so on. Then, use the same NodeJS script fix_manifest.js, with a slight modification, to concatenate all the generated files into a final manifest.json that is both accurate (having all the desired entries) and valid (JSON).

For that, we have to modify the existing generateMultiWebpackConfig method (in ./config/webpack/environment.js) to remove the existing clutter of disabling/enabling the writeToEmit flag in Manifest, which we no longer need. Instead, we will create a deep copy of the original webpack configuration and override the Manifest plugin opts for each entry. The deep copy is mandatory so that each pack file gets its own unique Manifest fileName.

const { environment } = require('@rails/webpacker')
const cloneDeep = require('lodash.clonedeep')

environment.generateMultiWebpackConfig = function(env) {
  let webpackConfig = env.toWebpackConfig()
  // extract entries to map later in order to generate separate 
  // webpack configuration for each entry.
  // P.S. extremely important step for tree-shaking
  let entries = Object.keys(webpackConfig.entry)

  // Finally, map over extracted entries to generate a deep copy of
  // Webpack configuration for each entry to override Manifest fileName
  return entries.map((entryName, i) => {
    let deepClonedConfig = cloneDeep(webpackConfig)
    deepClonedConfig.plugins.forEach((plugin, j) => {
      // A check for Manifest Plugin
      if (plugin.opts && plugin.opts.fileName) {
        deepClonedConfig.plugins[j].opts.fileName = `manifest-${i}.json`
      }
    })
    return Object.assign(
      {},
      deepClonedConfig,
      { entry: { [entryName] : webpackConfig.entry[entryName] } }
    )
  })
}
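For reference, the production config can then simply export the array of per-entry configurations returned by this method. A minimal sketch of ./config/webpack/production.js, assuming the wiring from the previous post:

process.env.NODE_ENV = process.env.NODE_ENV || 'production'

const environment = require('./environment')

// Export one webpack configuration per pack entry (multi-compiler build)
module.exports = environment.generateMultiWebpackConfig(environment)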

Finally, we will update the ./config/webpack/fix_manifest.js NodeJS script to concatenate all the generated Manifest files into a single manifest.json file.

const fs = require('fs')

let manifestJSON = {}
fs.readdirSync('./public/packs/')
  .filter((fileName) => fileName.indexOf('manifest-') === 0)
  .forEach(fileName => {
    manifestJSON = Object.assign(
      manifestJSON,
      JSON.parse(fs.readFileSync(`./public/packs/${fileName}`, 'utf8'))
    )
  })

fs.writeFileSync('./public/packs/manifest.json', JSON.stringify(manifestJSON))

Wrap up

Please note that compiling a huge number of JS/TS entries takes a lot of time and CPU, hence it is recommended to use this approach only in a production environment. Additionally, set max_old_space_size as per your needs to handle out-of-memory issues during production compilation – we are using 8000 MB, i.e. roughly 8 GB, here.

$ node --max_old_space_size=8000 node_modules/.bin/webpack --config config/webpack/production.js
$ node config/webpack/fix_manifest.js

Always run these commands one after the other (or chain them as shown below) to generate a fit and fine manifest.json 😙
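If you would rather not remember the order, you can chain both steps behind a single npm script as well. A sketch, assuming a scripts entry in your package.json:

"scripts": {
  "webpack:production": "node --max_old_space_size=8000 node_modules/.bin/webpack --config config/webpack/production.js && node config/webpack/fix_manifest.js"
}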

If you found this article useful in any way, feel free to donate and receive my dilettante painting as a token of appreciation for your donation.

Fantastic Beasts: The Crimes of Pythonworld


When I decided to make my foray into the Pythonic world, I stumbled upon the sorcery between the system-level Python 2.7 and Python 3. What are pip and pip3? I used to install Python packages using pip and, if that did not work, using pip3. One of them always worked 🙅‍♂️. I did not know what I was doing apart from just getting the system up and running, until I determined to see how deep the rabbit hole goes. But the rabbit hole was not that deep; it was my confused mind that made it deep until now…

pip vs pip3

As you may have already guessed, Python 3 is the successor of Python 2. In order to maintain backward compatibility of the package manager pip for Python 2, Python 3 came up with its own package manager under the name pip3. However, we can point the python and pip commands directly to the python3 and pip3 executables respectively (which we will see in the later sections), so that we do not have to deal with the python3 or pip3 commands while running a Python script or installing a Python package.

The upshot is that pip by default points to the system-level Python 2.7, and pip3 points to whatever version of Python 3 we have installed.

⇒  pip --version
pip 19.0 from (python 2.7)
⇒  pip3 --version
pip 18.1 from (python 3.7)

To naively alias the python and pip commands to python3 and pip3, we can add the following to our bash/zsh profile and reload the shell for it to take effect.

alias python=python3
alias pip=pip3
# BEFORE
⇒  python --version
Python 2.7.15
⇒  pip --version
pip 19.0 from (python 2.7)

# AFTER
source ~/.zshrc
⇒  python --version
Python 3.7.1
⇒  pip --version
pip 18.1 from (python 3.7)

This approach works; however, we constantly have to edit the bash/zsh profile to switch between two or more versions of Python. Clearly, we can do better.

Introducing pyenv

Pyenv allows us to install and switch between multiple versions of Python.
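Installing pyenv itself is a one-liner. Either of the following works, depending on your setup (Homebrew on macOS, or the pyenv-installer script on any *nix):

⇒  brew install pyenv
⇒  curl https://pyenv.run | bash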

pyenv versions

We can check which versions of Python are installed on our system with the following command. The * at the beginning marks the current Python version (the system Python 2.7 in this case) that the python and pip commands point to.

⇒  pyenv versions
* system
  3.6.3

pyenv install <version>

We can install any version of Python using the following install command. Note that installing a version does not implicitly switch to it.

⇒  pyenv install 3.7.2
⇒  pyenv versions
* system
  3.6.3
  3.7.2

pyenv shell <version>

To manually switch to any Python version (only in the current shell), we can use this command. That means killing the shell window would restore the Python version to the system-level one. Here we have switched to Python 3.7.2 in the current shell.

⇒  pyenv shell 3.7.2
⇒  pyenv versions
  system
  3.6.3
* 3.7.2

Introducing pyenv-virtualenv

Now we have fixed the problem of maintaining different versions of Python across various Python projects. However, a different but somewhat similar problem persists for Python packages too.

For example, imagine we have two Python projects running on top of Python 3.7.2 but using different versions of Django: 2.1.5 (the latest) and 1.9. Installing both one after the other using the pip install Django==2.1.5 and pip install Django==1.9 commands would override the 2.1.5 version with 1.9. Hence, both projects would inadvertently end up using the same Django version, which we do not want. That’s where Python virtual environments help.

There are many Python packages out there to manage virtual environments, such as virtualenv, virtualenvwrapper, etc. Each is better or worse than the others in some way; we are going to use pyenv-virtualenv, a pyenv plugin that uses virtualenv under the hood.
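Like pyenv, the plugin is a quick install, either via Homebrew on macOS or by cloning it into pyenv's plugins directory:

⇒  brew install pyenv-virtualenv
⇒  git clone https://github.com/pyenv/pyenv-virtualenv.git $(pyenv root)/plugins/pyenv-virtualenv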

pyenv virtualenvs

Similar to pyenv versions, this command shows us the list of virtual environments we have on our system. Below, I have one virtualenv named venv already created for Python 3.6.3.

⇒  pyenv virtualenvs
  3.6.3/envs/venv
  venv

pyenv virtualenv <environment-name>

Let’s create a virtual environment for Python 3.7.2 (the version currently active in the shell). Now we can see the two virtual environments created, but neither of them is activated yet.

⇒  pyenv virtualenv venv-3.7.2
⇒  pyenv virtualenvs
  3.6.3/envs/venv
  3.7.2/envs/venv-3.7.2
  venv 
  venv-3.7.2 

pyenv activate <environment-name>

Let’s activate the virtual environment venv-3.7.2. The * at the beginning marks the activated virtual environment, where Django will be installed.

⇒  pyenv activate venv-3.7.2
⇒  pyenv virtualenvs
  3.6.3/envs/venv 
  3.7.2/envs/venv-3.7.2 
  venv 
* venv-3.7.2 

First, we can confirm whether Django is installed in the activated virtual environment. It is not, so we will install Django 1.9.

# BEFORE
⇒  pip list --format=columns
Package    Version
---------- -------
pip        19.0.1
setuptools 28.8.0

# AFTER
⇒  pip install Django==1.9
⇒  pip list --format=columns
Package    Version
---------- -------
Django     1.9
pip        19.0.1
setuptools 28.8.0

So far so good. Now we must verify that pyenv-virtualenv gives us the package isolation we wanted.

pyenv deactivate

To check that, we can deactivate the current virtual environment. This command restores Python to the system-level one, and pip list now shows all the global Python packages installed on our system. Notice that Django is no longer listed, since we got out of the venv-3.7.2 virtual environment.

⇒  pyenv deactivate
⇒  pyenv virtualenvs
  3.6.3/envs/venv 
  3.7.2/envs/venv-3.7.2 
  venv 
  venv-3.7.2 
⇒  pip list --format=columns
Package                Version
---------------------- -------
airbrake               2.1.0
aniso8601              4.1.0
arrow                  0.10.0
asn1crypto             0.24.0
attrs                  18.2.0
bcrypt                 3.1.6
bitarray               0.8.3
boto                   2.49.0
boto3                  1.9.83
.
.
.
pip                    9.0.1
setuptools             28.8.0

Wrap up

As of now, pyenv and pyenv-virtualenv are serving me well. I hope things stay stable going forward too. 🤟

If you found this article useful in any way, feel free to donate and receive my dilettante painting as a token of appreciation for your donation.

Managing CRON with Ansible


Setting up a CRON job manually is child’s play, so why am I writing about it? For two main reasons:

  1. My experience of setting it up with Ansible
  2. Common mistakes I made which others can avoid

CRON

The CRON daemon is a long-running process that executes commands at specific dates and times. This makes it easy to schedule activities like sending bulk emails. CRON relies on a crontab file that holds the list of scheduled commands to run. We can manually add/edit/remove scheduled commands directly in the crontab file, but this may introduce bugs, especially when the list grows long. Ansible helps in deploying such CRON jobs effortlessly.
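For reference, the raw crontab entry for the job we are about to configure via Ansible would look like this (fields are minute, hour, day of month, month, and day of week, followed by the command):

# m    h    dom  mon  dow  command
5      23   *    *    *    /bin/sh /usr/sbin/apache_restart.sh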

Ansible

Ansible is an IT automation and orchestration engine. It uses YAML syntax for writing such automation, called Plays, and the file itself is referred to as a Playbook. Playbooks contain Plays. Plays contain Tasks. And Tasks run pre-installed modules sequentially and trigger optional handlers.
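To make that nesting concrete, here is a minimal, hypothetical playbook skeleton: one Play targeting a group of hosts, containing one Task that runs a module.

# playbook.yml (hypothetical example)
- name: A Play targeting the web servers
  hosts: webservers
  tasks:
    - name: A Task running the ping module
      ping: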

I have used the cron module in Ansible to set up a task that configures a CRON job as follows:

- name: Run CRON job every day at 11.05 PM UTC
  become: yes
  become_method: sudo
  cron:
    name: "apache_restart"
    user: "root"
    hour: 23
    minute: 5
    job: "/bin/sh /usr/sbin/apache_restart.sh"
    state: present

Imagine there is a fictional CRON job to restart Apache2 at the specified time every day – god knows why. But I made some avoidable mistakes initially while setting it up. Let us go step by step through each of those mistakes:

become and become_method

These flags are only necessary when running the job with sudo or another privilege escalation method. In this case, I wanted to run the /bin/sh /usr/sbin/apache_restart.sh command with sudo, and I did not want the password prompt that we usually get while running such commands manually. The become flag takes care of the escalation without a password prompt (assuming passwordless sudo is configured for the Ansible user).

In the beginning, I had forgotten to add these flags, which prevented the CRON job from executing the apache_restart.sh script as expected.

cron module

Ansible lets us use the pre-installed cron module, which makes it far easier to set up CRON jobs. Although, by mistake, I had initially written the cron module at the top level of the playbook – in place of a Play – instead of under a Play’s tasks, as shown below.

- cron:
    name: "apache_restart"
    user: "root"
    hour: 23
    minute: 5
    job: "/bin/sh /usr/sbin/apache_restart.sh"
    state: present

As we learned before, only Tasks can run pre-installed modules. So Ansible instantly threw an error while deploying, and I managed to save my face 🤦‍♂️

cron name

I thought that since I had already named the task, naming the CRON entry would not be necessary. I could not have been more wrong – and I was embarrassed too. Without a CRON name, each time you deploy any changes via Ansible, it sets up a new CRON job, leaving the previous one as is. So I was literally restarting Apache2 thrice at a time. Remember, the CRON name works as a unique key to identify whether a CRON job with the same name is already set up. If not, Ansible sets up a brand new CRON job in the crontab file; otherwise, it overrides the existing one with the new configuration.

state

The default state of a CRON job is present. To disable a particular CRON job, you change the state to absent and redeploy it via Ansible. I was using the state present without a CRON name, which was creating multiple crontab entries on each deployment.
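For instance, disabling the fictional job above is just a matter of flipping the state and redeploying. Same name, same user; only the state changes:

- name: Remove the apache_restart CRON job
  become: yes
  become_method: sudo
  cron:
    name: "apache_restart"
    user: "root"
    state: absent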

job

The job key takes the actual command that you want to run at the specific time/date. Make sure to use absolute command paths, since CRON runs with a minimal environment and PATH.

Wrap up

I also use the tail -f /var/log/syslog and grep CRON /var/log/syslog commands to check the logs and make sure that CRON actually runs the script I specified.

If you found this article useful in any way, feel free to donate and receive my dilettante painting as a token of appreciation for your donation.

Kung Fu “Pandas”


I plan to write an article per week on Data Engineering from the perspective of a beginner – the first one was Python for Frontend Engineers. Do not expect a proven pathway to becoming a Data Engineer someday, because I do not have any strategy at the moment. I am just following my gut feeling to become proficient sooner rather than later. So consider these blog posts as my personal notes, which may or may not be helpful to others.

Pandas is a data processing tool that helps with data analysis – meaning it provides various functions/methods to manipulate large datasets efficiently.

I am still learning Pandas and will continue to explore its new features. The Pandas documentation is pretty self-explanatory, so in this article I will just give a glimpse of its powers, like a trailer.

This is how you can import pandas and start using it right away.

Importing Pandas

import pandas as pd

pd.* # where * denotes all the supported methods

Pandas supports two data structures at the moment.

Series

The Series data structure represents a one-dimensional labeled array, much like a Python dictionary. The values can be of any primitive type supported by Python.

Creating Series

sons_of_pandu = {
  'son1': 'Yudhishthira',
  'son2': 'Bhima',
  'son3': 'Arjuna',
  'son4': 'Nakula',
  'son5': 'Sahadeva'
}
pandavas_series = pd.Series(sons_of_pandu)
print(pandavas_series)
# Prints following in Jupyter Notebook
# son1   Yudhishthira
# son2   Bhima
# son3   Arjuna
# son4   Nakula
# son5   Sahadeva
# dtype: object

Changing Indices of Series

Sometimes we prefer to change the indexing for brevity. So here we can change the index of the series to Pandavas’ progenitors.

pandavas_series.index = ["Yama", "Vayu", "Indra", "Ashwini Kumara Nasatya", "Ashwini Kumara Darsa"]
print(pandavas_series)
# Prints following in Jupyter Notebook
# Yama                   Yudhishthira
# Vayu                   Bhima
# Indra                  Arjuna
# Ashwini Kumara Nasatya Nakula
# Ashwini Kumara Darsa   Sahadeva
# dtype: object

Slicing Series

Slicing is really handy when glancing at a large dataset. We can slice the series for an exploratory view as follows.

pandavas_series[0:2] # Prints first and second rows excluding the third
pandavas_series[1:]  # Prints all rows except the first one
pandavas_series[-2:] # Prints the last two rows only

Appending Series

It is very common to deal with multiple datasets in Pandas, and the append method is a complement you cannot ignore.

kauravas = ["Duryodhan", "Dushasana", "Vikarna", "Yuyutsu", "Jalsandh", "Sam", "Sudushil", "Bheembal", "Subahu", "Sahishnu", "Yekkundi", "Durdhar", "Durmukh", "Bindoo", "Krup", "Chitra", "Durmad", "Dushchar", "Sattva", "Chitraksha", "Urnanabhi", "Chitrabahoo", "Sulochan", "Sushabh", "Chitravarma", "Asasen", "Mahabahu", "Samdukkha", "Mochan", "Sumami", "Vibasu", "Vikar", "Chitrasharasan", "Pramah", "Somvar", "Man", "Satyasandh", "Vivas", "Upchitra", "Chitrakuntal", "Bheembahu", "Sund", "Valaki", "Upyoddha", "Balavardha", "Durvighna", "Bheemkarmi", "Upanand", "Anasindhu", "Somkirti", "Kudpad", "Ashtabahu", "Ghor", "Roudrakarma", "Veerbahoo", "Kananaa", "Kudasi", "Deerghbahu", "Adityaketoo", "Pratham", "Prayaami", "Veeryanad", "Deerghtaal", "Vikatbahoo", "Drudhrath", "Durmashan", "Ugrashrava", "Ugra", "Amay", "Kudbheree", "Bheemrathee", "Avataap", "Nandak", "Upanandak", "Chalsandhi", "Broohak", "Suvaat", "Nagdit", "Vind", "Anuvind", "Arajeev", "Budhkshetra", "Droodhhasta", "Ugraheet", "Kavachee", "Kathkoond", "Aniket", "Kundi", "Durodhar", "Shathasta", "Shubhkarma", "Saprapta", "Dupranit", "Bahudhami", "Yuyutsoo", "Dhanurdhar", "Senanee", "Veer", "Pramathee", "Droodhsandhee", "Dushala"]
kauravas_series = pd.Series(kauravas)
pandavas_series = pandavas_series.append(kauravas_series) # reassign: append returns a new Series
print(pandavas_series)
# Prints following in Jupyter Notebook
# Yama                   Yudhishthira
# Vayu                   Bhima
# Indra                  Arjuna
# Ashwini Kumara Nasatya Nakula
# Ashwini Kumara Darsa   Sahadeva
# 0                      Duryodhan
# 1                      Dushasana
.
.
.
# Length: 106, dtype: object

Dropping from Series

Pass the index to drop any row from the series.

pandavas_series.drop('Yama') # Prints following in Jupyter Notebook
# Vayu                   Bhima
# Indra                  Arjuna
# Ashwini Kumara Nasatya Nakula
# Ashwini Kumara Darsa   Sahadeva
# 0                      Duryodhan
# 1                      Dushasana
.
.
.
# Length: 105, dtype: object

Dataframes

The DataFrame data structure represents a two-dimensional labeled data structure, like a table – for example, a Python list of dictionaries.

Creating Dataframe

sons_of_pandu = [{
  'name': 'Yudhishthira',
  'progenitor': "Yama"
}, {
  'name': 'Bhima',
  'progenitor': "Vayu"
}, {
  'name': 'Arjuna',
  'progenitor': "Indra"
}, {
  'name': 'Nakula',
  'progenitor': "Ashwini Kumara Nasatya"
}, {
  'name': 'Sahadeva',
  'progenitor': "Ashwini Kumara Darsa"
}]
df_pandavas = pd.DataFrame(sons_of_pandu)

Head’ing DataFrame

df_pandavas.head()  # returns first 5 rows
df_pandavas.head(3) # returns first 3 rows

Tail’ing DataFrame

df_pandavas.tail()  # returns last 5 rows
df_pandavas.tail(3) # returns last 3 rows

Sorting DataFrame

df_pandavas.sort_values(by="name")

Slicing DataFrame

df_pandavas[0:2] # Prints first and second rows excluding the third
df_pandavas[1:]  # Prints all rows except the first one
df_pandavas[-2:] # Prints the last two rows only
df_pandavas[["name"]] # Prints all rows with "name" column only

Copying DataFrame

df_pandavas_in_alternate_dimension = df_pandavas.copy()

Wrap up

That’s it. There is more to Pandas than mere slicing/merging/copying/sorting. You can easily read/write CSV/Excel files in Pandas like never before – for instance, the round-trip shown below. Head over to the Pandas documentation for more information.
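A minimal sketch with the DataFrame above (the file name is hypothetical):

df_pandavas.to_csv('pandavas.csv', index=False)  # write the DataFrame to a CSV file
df_from_csv = pd.read_csv('pandavas.csv')        # read it back into a new DataFrame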

If you found this article useful in any way, feel free to donate and receive my dilettante painting as a token of appreciation for your donation.

Frontend Frontier – sci-fi short story


This is my 2nd science-fiction short story (the 1st one is here). It is a tribute to Frontend Engineers and how their life and job might change in the future due to the advent of ML/AI.

Special thanks to Aamna, Roshan, and Abhishek for grammatical/punctuation improvements and the overall constructive feedback.

Please star or comment to provide your valuable feedback, whatever it may be. I’m immune to anything hurled at me 😅.

IN A VACANT BUT CHAOTIC CABIN, a soul, deep in reverie, entered. The door plate had a caption that read, “Yesterday’s gadgets were in front of you, Today’s gadgets are in your hands, and Tomorrow’s gadgets will be right within you”. That was the most influential gift his grandpa had given him, at the age when he knew the only use of a tongue in his mouth was to tease others. Now he was in his late twenties and had worked part-time as an A.I. specialist for many years. Dare not gaze in awe, since everybody was an A.I. specialist these days, mostly due to high demand and high wages compared to other jobs – or, sarcastically, no jobs. He recalled the “There is an APP for that!” story often shared by his grandpa to justify the current situation. Because of the rampant use of Machine Learning tools and techniques, making something smarter was a piece of cake now. Of course, you had to dig deeper to improve the existing Machine Learning models or invent your own for an unprecedented result.
Dave was an autodidact who inhabited this working space. The room was furnished with an aesthetically minimalistic design which gave the impression of everything fairy. On one side of the room, the door-facing wall was painted pinkish blue, only to warp it nicely or rather gel with the other drabbed walls. A silvery photo frame, containing a baker’s dozen of meaningless photos, was planted on one of the drabbed walls in the shape of a Kalpataru tree. It was connected to the Line (called the Internet before), as the framed-in pictures changed randomly, or perhaps as per the mood and emotions emanating from the people around. It turned out that even the walls changed colors and patterns by injunction.
There were no open vents, but still, it was quite frigid in there for anybody to retreat to the chaise longue—cemented far away in the corner to rest for eternity.
Another wall, facing east, had big nestled sliding windows for sunlight, and thereby warmth, to slip through. As usual, the Sun shone on the sliding windows from outside like stage lighting. Looking down with half-shut eyes through the glass confirmed that the cabin was part of a 100-story construction erected quite above the ground. It might be lodged on top of some flat structure below, but that could not be seen from up above. Even the surroundings were occupied with similar but sufficiently spaced-out constructions.
Adjacent to the sliding windows, stacked circular knobs controlled the artificial light—sunlight, to be precise, supposing the sun was not around—emerging from the glass to populate the room, since no working hours were pre-defined.

Most of the things were wall-mounted; the only exception was Dave’s workstation. Long gone were the days when people profoundly designated themselves as software engineers. In those days, their work and identity extended beyond mere programming of software, or so-called coding: analyzing project requirements, brainstorming with a team, creating prototypes, beta testing with a handful of early adopters, fixing bugs, churning out new features, and many more such activities had made engineering the nail-biter of a job, in a good way. But things had changed drastically, and with unexpected swiftness, since then.
Now they were called experts.
When people stopped doing traditional searches on various search engines back then, the obvious progression for many of those search engines was to become Digital Personal Assistants. Many could not fathom the idea, and some died due to competition and, so to speak, the complexity of giving accurate answers. Today, experts collaborated, or ironically fraternized, with an Intelligent Digital Personal Assistant (IDPA) for any software-related work, as if humans and machines had switched roles. Even the schism between frontend and backend had become illusory.
Nowadays these experts had been trusting IDPAs more than their instincts when it came to problem-solving. By the same token, this probably was what had made the term software engineer obsolete in this era.

When Dave entered the room, IDPA wished him welcome, “Good Morning, Dave!” and reminded him in a soothing female voice, “I’m stoked to work alongside you on a new project today.”, followed by a giggle. Then it auto-played his favorite song when Dave said nothing in return, deadpan. Dave was mulling over a problem he had been carrying in his head since last night. Yeah, that was one of the things he was still allowed to do until A.I. figured out how to do it on its own; speculating on the exponential growth in technology, that would happen soon.
While the song reached its peak, Dave slingshotted himself off to the fridge, fetched as many flavors of staple meal as he could clasp, and hurriedly trotted to his desk before he unintentionally dribbled everything on the matted floor. In the meantime, his workstation lit up automatically when it sensed him near. IDPA had recognized Dave and booted his monitor instantly – or it just reappeared magically, as if someone had tossed it into the realm of 3D from the unknown 4D world – while he was busy tearing open one of the staple meals he had just lugged.
After some time, when his mouth was full enough to chew some of it, he transferred the bowl to one hand and threw the remaining meal bags to the other side of the desk. Then he buried himself in the ergonomic chair and, chewing vigorously, raised one hand with the palm facing the drabbed wall, like a rocket flying off the surface, only to stop the music.
He turned his face squarely towards the computer screen, affixed to the table. Then he stared at it long enough to make the surroundings blurry after a while, almost nonexistent, as if he were floating on a cloud.
He was alone. Working From Home Forever.

 

AS USUAL, HE STUFFED the remaining meal into his mouth and set the bowl away, literally sliding it far across the desk, never to grab it again. Dave began his day by skimming over unread messages first, in the nick of time, as he had already linked IDPA to the universal messaging gateway.
Back in the old days, when the poorest of the poor were given affordable internet, it became obvious that the accumulation of such huge data by just one company would be hazardous. Thence, the market leaders from various countries were the first to realize that something needed to be done to avoid a draconian future. So these early liberators decided to come to terms with it and conceived of creating a universal platform to amalgamate the disintegrated tribes. Eventually, their efforts gave birth to the universal messaging gateway. It had been developed as an open project since then, mainly consisting of a mixture of open APIs with a distributed file system at its helm. This meant that no one company would be held accountable for owning the public data – not even the government, by any means.
With such an architecture at its core, even the smallest entrepreneur delivering weed at home could use the messaging gateway to notify customers of anything. The weed delivery Page in ARG, however, had to provide hooks for the universal messaging gateway to pull in real-time updates and notify those customers in a scalable manner. Later, a similar strategy was used with IDPA so that any kind of request could be made without even installing the Page. Just give a command to your IDPA and boom!

The Universal Messaging Gateway now happened to be the common but much more powerful interface to access all sorts of messages coming from family members, friends, colleagues, ITS, IoT home appliances, and all sorts of fairy things mounted around Dave’s room. And from strangers too—spam detection was an unsolved problem. It would worsen, though, some said.
IDPA automatically moved a few important incoming messages to the top, while the overwhelming majority of such messages were grouped separately for Dave’s consideration. Some experts preferred to check each message manually by disabling IDPA (its icon resembled a bot head placed atop a human abdomen) in order to vindicate their amour-propre, but for Dave this just saved time.
Predominantly, incoming messages would be resolved instantly by IDPA itself, as soon as they arrived, provided they were project specific. But those replies were unerringly apt, as if Dave himself were involved. Dave could still see such messages on the right side of the screen. He prodded a flat sheet of a keyboard, paired with ARG, to close the list of auto-replied messages and shift his focus to the important ones in the center. Then he went full screen with a gesture.
After a while, IDPA voiced sympathetically, “The injunction to be nice is used to deflect criticism and stifle the legitimate anger of dissent”.
It was one of the famous quotes, fetched furtively over the wire while Dave was busy smashing keys. In this case, IDPA was telling him not to be rude.

 

SPLAYED ACROSS BOTH EARDRUMS in a stream of steep hum was the reminder of an upcoming live code conference, happening at a luxurious resort. Dave was supposed to be attending it but was woefully caught up with urgent work. Earlier, he had watched such events remotely without any privation, despite the fact that one had to be physically present at the venue to grab the various sponsored tech goodies for free.
He welted on the notification to start a live stream that swiftly covered the preoccupied screen in ARG, as if a black hole had swallowed a glittering star into oblivion.
The live stream he was so competitively gazing at was not rendered on a real computer screen. The projection of the virtual computer screen, made via the ARG he was wearing, had the shape of a glowing 3D rectangle. Though glowy, it was not fairy. It looked almost real, as if it had materialized there, and moreover, others could see it too if they were on the same Line. Further, he could stretch the screen to suit his needs. Sometimes he would transmogrify it into multiple screens for more arduous tasks. Inevitably, he would start the new project today, so a single screen. But wider.
The Augmented Reality Goggle (ARG) was a small device, the shape of a cigarette, mounted behind both ears, which made this possible. It zip-tied to the tops of his left and right auricles and connected by a thin wire from behind – someday to talk to the amygdala and send data via neural signals to enable brain-to-brain communication. High-definition cameras and mics were attached to both the fore and rear ends of the device so that a 360-degree video feed could be viewed or captured.
By the same token, ARGs were of equal stature to human eyes.

Ruffling his hair with his left hand, Dave reposed in the chair as the chattering noise coming from the remote conference advanced. Still, plenty of time remained before the keynote would start.
ARG had been upgraded beyond what it was during its nascent stage. Now you could beam the reality around you for others to experience in real time. Similarly, the conference sponsors had broadcast the whole conference hall in a 360-degree video feed that anybody – and most importantly, from anywhere – could tap into. That way, Dave could see everyone who was attending the conference physically, as if he were with them. Even the conference attendees used ARGs during the live conference instead of watching directly with their naked eyes, mostly for the fancy bits that you would not see otherwise.
Strips of imagery scrolled in his line of sight as Dave observed the conference hall in a coltish manner, looking for some familiar faces. With a mere gesture, you could hop from person to person to face them, as if you were trying to make eye contact to begin a conversation. The only difference here was that the person on the other side would not know of it until you sent a hi-five request.
Dave stiffened for a moment and looked with keen observation for his best friend, who was physically attending the conference this time. “There he is!”, Dave shouted aimlessly. Before sending him a hi-five request, Dave flipped his camera feed by tapping on the computer screen to face himself, and drew a blooming rectangle on it – starting with a pinch in the middle of the screen, both fingers then moving away from each other – that captured his face down to his torso like a newspaper cut-out. That was, as a matter of fact, the only subset of his untidy reality that would be broadcast when conversing. He flipped the screen back to face his friend and sent him the hi-five. They retreated and discussed technology in the midst of laughter and jokes until the conference began.
Subsequently, Dave shifted gears, colonizing the chair and the escritoire with his feet, to watch the keynote for a few productive hours.

 

DAVE HOPPED OVER TO A CODE EDITOR as he reclined in the chair with a satisfied feeling after watching his favorite conference. Dave’s job was to find various ways to make IDPA and related AI machinery astute. On this day, however, he would spend his time building a Page, a.k.a. an ARG application.
The code editor opened all the files he had left unclosed last night. Then it was overlapped by a small notification dialog about pending reviews. Dave resisted the urge to dismiss the notification; even though he would need complete focus for the new project starting today, he could not leave his colleagues hanging in the air.
In the past, some people were worried about Software eating the world, but lo and behold, software ate the Hardware too. Software as a Service was the norm back then, but after a few years some smart folks thought of Hardware as a Service, and it was a game changer. As a result, today, if you need a new machine to run any software, you just have to launch the AHS Page (Augmented Hardware Services) in ARG and choose the configuration you prefer in terms of RAM, graphics card, HD display, storage, and whatnot – an up-and-running workstation in no time, and far cheaper. After this setup, all you needed was a high-speed Fiber Line, which was pretty commonplace nowadays. ARG lets you connect to the Line, which in turn allows you to interact with the workstation (and many other things) in Augmented Reality. It is, in other words, the entire operating system and whatever Pages you need at the moment, all running on the cloud and then projected into your field of view.
That was how Dave’s code editor was rendered too.

Dave engaged and finished the review without further ado, except at one place where, according to Dave, it needed a personal touch. So he wrote a polite explanation to prove his point, with a mundane graph drawn using arrows (–>), dashes (—), and dots (…), and attached a video recording along with it – meaning he just had an idea for making the program extensible and immune to future requirements (still a dream). Then he moved on to face the new instance of the code editor. Before he began thinking about the new project, IDPA prompted unconventionally in his ears, “Dave, I would like to inform you that you forgot to submit the review”. Dave quickly submitted it with embarrassment. With a revengeful sense, Dave patiently teased IDPA,

“What is life?”

“I know you are not interested in a scientifically accurate answer”, rebutted IDPA in disguise after sensing Dave’s intention, “but to me (and you too), it’s a ToDo list!”.

Dave wanted to take his revenge but, defeated by the machine instead, he conceded and decided to divert his attention to the job at hand.

 

WHEN THE USE OF MACHINE LEARNING techniques soared, many programmers gave up on traditional UI/UX development and instead focused on training the machines to do it. They had predicted long ago that if this was achieved to the level of human intelligence, it would save time and money for many. Today, IDPA was not that profound when it came to creatively laying out a design on an empty canvas on its own. Instead, it relied a lot on existing designs and trends, only to come up with somewhat similar but slightly different designs compared to the rest. Although it was not imaginatively creative at all, Dave was still optimistic about having IDPA on his side today.
The IDPA built into the code editor was given a simple command to fetch the designs for the new project from the cloud repository. The new project was about a newly launched Flying Commercial Vehicle, which was as compact as a 4-seater electric car but could fly without the need for big rotor systems. Dave was given the task of creating an ARG Page that must include various shots of the Vehicle from all angles, using which people around the world could make bookings from anywhere. When he fed those details to IDPA from within the code editor, it quickly brainstormed for nanoseconds and churned out a design which made Dave sad – not because it was bad, but because it looked the same as hundreds of other Pages he had seen before. Most importantly, he did not like the placement of the snapshots and the details provided.
There was one more way, though, especially for those who still had some artistry left in them when it came to designing Pages. He briskly drew boxes on a blank canvas in a circular form making up the circle of life (as if that were the last thing people needed to complete them), filled some of them with random text, and marked certain areas where the snapshots must go. Then he fed his magnum opus to IDPA, which produced the Page out of it instantly. Along the same lines, he drew a few more things to capture the booking details and had them Page‘d too. In the meanwhile, IDPA slapped some statistics on the screen; apart from some not-so-important maths, it showed the file size of the compiled Pages vs the original design files. That made Dave sigh in satisfaction, as if humans had contacted an alien race they could talk to.
Dave went through the created Pages just to read the information about the vehicle mentioned next to each angled shot. It looked like he was on fire today, since ideas kept coming to him for making the current Page design even better. Now he could either go back to the drawing board or make edits in the existing Pages himself. He thought for a moment and decided to go with the latter option, that is, to open an Interface Builder. With the flick of a button, it literally transformed the code editor UI into the interface builder UI, snapping a bunch of palettes onto each side, only to assist him.
Dave focused on the current design to change it the way he intended and also fixed a few design errors that the intelligent interface builder suggested, given the best practices and performance incentives. It was intelligent for a reason, since it had automatically added appropriate validations on the data fields that were supposed to capture the registration details of people who wanted to buy the vehicle. The only thing Dave had to do was connect them to the cloud data storage, which was child’s play. So he picked one of the few available cloud storage engines to save the data and pelted the finish button, which in turn compiled the final ARG application that he then pushed live on the ARGStore – the one-stop destination for hosting all sorts of ARG applications now.
This had become the reality now, since browsers were long dead and the Web, if you must know, had been transfigured beyond recognition.

Dave saw that IDPA had been holding back some news for a long time, so as not to disturb him. He unmuted it, only to learn that some researchers had cracked how the brain creatively thinks, and he decided to let go of the thoughts he had been mulling over since the previous night.
If you liked this story in any way, feel free to donate and receive my dilettante painting as a token of appreciation for your donation.

Cautionary tale for Experts


Imagine a beginner (who has come from a decent front-end background) asking not-so-dumb questions to an expert in the field of native application development. I hate the term “expert” because no one is purely an expert in this field – every person learns something new from others on a daily basis. But I’m quoting the term for the sake of the article. Here is how the conversation went:

Beginner: Is there a way to find an SDK version like we do with JS library?

Expert: What a dumb question! You do not understand native development that’s why you are asking it.

Beginner: Actually, the documentation does not have those details, hence I asked.

Expert: Have you ever coded in Android/Java??

Beginner: Nope. Will I automatically get the new version of the SDK or have to update the manifest file each time?

Expert: Yes, automatic update!

Beginner: So I do not even have to run some sort of build command to pull in the new version like npm update does.

Expert: Please don’t over analyze the things that you don’t understand or don’t know. It wastes your as well as my time 🙂

Beginner: I’m not over-analyzing, just trying to understand by comparing it with frontend development. But you could be a little gentler resolving such queries raised by a beginner like me.


As you can see, the expert was being arrogant about his knowledge and skills. To me, that would have been acceptable if and only if he/she had resolved my queries. Unfortunately, he/she did not answer any query in a sane way. Una Kravets‘s recent tweet speaks about the same issue:

Through this post, I just want to advise all the experts:

Whatever high-skilled wits you’ve acquired were based on documentation, manuals, books, and articles written by someone else. You would not be the expert you are today if those experts had disparaged you like that. All experts owe it to those they have learned from, and the only way to pay back the favor is by sharing your knowledge with the next generation without being cocky.

Remember, Sharing your knowledge with others does not make you less important.

Cautionary Tale for Angular Developers


Okay, now take a moment to watch this video

I’m sure after watching it, you may have realized that I’m not here to talk about why Neo falls into a coma after stopping the sentinels. Instead, I’m going to talk about why my application crashed the same way in the Windows 8 RT webview and what I learned.

The Reality

An application was built on top of Angular.js and Twitter Bootstrap for Desktop, Android, iOS, and Windows 8. For touch devices, we had created a native wrapper that loads the application in a webview. It looked fine on Android and iOS devices but crashed on Windows 8 RT upon opening a Bootstrap modal popup. And I had no clue why Windows 8 RT alone.

This is what I was using to open a modal window. I know it’s very bad to do imperative DOM manipulation inside a controller, but as there were no directives bound to .modal, I believed it was safe to use.

$('.modal').modal({backdrop: 'static'});
But I was wrong. In Windows 8 RT, the above line was throwing the error “Object [object Object] has no method ‘modal’”. My guess is that I must have been executing Bootstrap code in the middle of a $digest cycle or something.

The Fix

To fix this issue, I could either move that code into a custom directive and toggle the modal’s state there, or use Angular UI Bootstrap. I decided to go with the latter. Finally, I learned a very hard lesson while fixing the issue: one should spend some time getting it right rather than just getting it done.

Follow the best practices or get ready to be bitten.

So I used the AngularUI Bootstrap modal directive instead of doing DOM manipulation in the controller.

var App = angular.module('App', ['ui.bootstrap']);
App.run(function($rootScope) {
  $rootScope.opts = {
    backdrop: true,
    backdropClick: false
  };

  $rootScope.modalOpen = false;
});

Finally updated the DOM as follows:

<button class="btn btn-primary btn-large" ng-click="modalOpen = true">Open Modal</button>

<div class="modal hide" options="opts" modal="modalOpen">
  <div class="modal-header">
    <h3>Modal Header</h3>
  </div>
  <div class="modal-body">
    Modal Body goes here...
  </div>
  <div class="modal-footer">
    <div class="btn" ng-click="modalOpen = false">Close</div>
  </div>
</div>

If you found this article useful in any way, feel free to donate and receive my dilettante painting as a token of appreciation for your donation.

Setting up github like server locally using Gitblit


After very long research, I recently came across an awesome open-source alternative to GitHub Enterprise. There are many of them, including GitLab, SCM-Manager, GitStack (Windows), etc., but their installation was very difficult and time-consuming.

Introducing GitBlit

It is an open-source, pure Java stack for managing, viewing, and serving Git repositories. Gitblit GO is an integrated, single-stack solution based on Jetty. It bundles all dependencies so that you can go from zero to Git in less than 5 minutes. I’m not going to go into the installation steps as they are already given on their website. Download Gitblit GO and follow the instructions for installation.

Customizing Gitblit

A few things to check in case you want to customize the behavior of the Gitblit server by modifying gitblit/data/gitblit.properties:

# Base folder for repositories
git.repositoriesFolder = /var/www/github

# Use local IP to make it accessible within LAN
git.httpBindInterface = 172.18.11.178

# Listen on port
server.httpPort = 1337

You can write a small bash script, start.sh, to avoid typing “java -jar gitblit.jar” every time you run the Gitblit server (make it executable with chmod +x start.sh).

#!/bin/bash
java -jar gitblit.jar
$ ./start.sh
INFO  ***********************************************************
INFO              _____  _  _    _      _  _  _
INFO             |  __ \(_)| |  | |    | |(_)| |
INFO             | |  \/ _ | |_ | |__  | | _ | |_
INFO             | | __ | || __|| '_ \ | || || __|
INFO             | |_\ \| || |_ | |_) || || || |_
INFO              \____/|_| \__||_.__/ |_||_| \__|
INFO                        Gitblit v1.3.0
INFO  
INFO  ***********************************************************
INFO  Running on Linux (3.11.0-15-generic)

Open your browser at http://172.18.11.178:1337 (depending on your chosen configuration) and log in with admin/admin.

Creating Bare Repositories

Once logged in as admin, go to repositories and click the new repository link. Enter the repository name and description under the General tab as shown below:

Creating Bare Repository

Then go to the Access Permissions tab, add owners of the repository, change the access restriction to “authenticated clone & push”, and tick the allow authorized users to fork option – that means only authorized users can view/clone the repository and play with it. Finally, define how users can interact with the repository. In this case, we give admin the RW+ (God) rights by clicking the Add button. Lastly, click Save to save the configuration. This will take you to the list of repositories, where you’ll see our repo listed with admin as an owner.

Provide access to repository

You will be presented with a few instructions after selecting our repository (http://172.18.11.178:1337/summary/coolapp.git) from the list. Do not worry about what’s shown on the page, as we only care about the repo URL.

Empty Repo

Importing the Project

If you are already using git locally for your project, feel free to skip the steps below. Otherwise, fire up the terminal and run the given commands:

  1. Create an application folder
  2. Add project files
  3. Create an empty git repository to watch over the project
  4. Stage all files
  5. Commit all staged files
$ mkdir coolapp && cd coolapp
$ touch index.html
$ git init
$ git add index.html
$ git commit -m "First Commit"

As we have set up a git repository on a local machine, we need to link it up with the remote repository we’d created previously.

  1. Add remote
  2. Check the remote added (you should see what you’d added here)
  3. Push local commits on the server (you’ve to enter the password for admin)
  4. Go back to our web interface and refresh the page
$ git remote add origin http://admin@172.18.11.178:1337/git/coolapp.git
$ git remote -v
$ git push -u origin master

You should see the commit message we just pushed.

First Commit

Expanding the Team

It’s very rare that only one person works on a project; we often have to give other developers access to the repository so that they can contribute.

As you know, admin can create new users so lets have a look at how we can assign new member to the project.
Go to users link on the top and click new user. Enter username and password. Allow him to fork the authorized repository. Finally select access permissions tab and provide appropriate repository permission. In this case, rockstar is only allowed to clone the coolapp repository which means he has to fork it in order to contribute. The single most benefit of the forking is that all the changes have to be scrutinized and validated before being landed into the main repository. Mostly a good option for Jr/Sr. developers.

New User

New User Permissions

Forking the repository

Once rockstar logs in, he will see all the projects assigned to him. He can now choose the project and click the “fork” button on the right.

Fork a Repo

Post fork, you can see the URL has changed to `http://rockstar@172.18.11.178:1337/git/~rockstar/coolapp.git`, which means a new forked repository has been created on the server for the rockstar user.

Cloning the forked repository

We have just created a fork of coolapp for the rockstar user, but in order to work on the project, he has to clone it as coolapp-rockstar on his workstation. Run the following commands in the terminal:

Post Forking

$ git clone http://rockstar@172.18.11.178:1337/git/~rockstar/coolapp.git coolapp-rockstar
$ cd coolapp-rockstar
$ git remote -v
origin  http://rockstar@172.18.11.178:1337/git/~rockstar/coolapp.git (fetch)
origin  http://rockstar@172.18.11.178:1337/git/~rockstar/coolapp.git (push)

As you can see, the remote points to the forked repository on the server, but there is no way to fetch updates from the main repository. So let’s add one, and we’ll call it upstream. For that, you have to go to the repositories interface and select the main coolapp repository.

Clone Repo


$ git remote add upstream http://rockstar@172.18.11.178:1337/git/coolapp.git
$ git remote -v
origin  http://rockstar@172.18.11.178:1337/git/~rockstar/coolapp.git (fetch)
origin  http://rockstar@172.18.11.178:1337/git/~rockstar/coolapp.git (push)
upstream        http://rockstar@172.18.11.178:1337/git/coolapp.git (fetch)
upstream        http://rockstar@172.18.11.178:1337/git/coolapp.git (push)

Now your local repository has been linked to both the forked and the main repositories on the server. You can fetch and merge updates from the main repository into your forked (local) repository using the commands below.

$ git fetch upstream
$ git merge upstream/master

It’s a best practice not to work on the master branch directly; instead, create a new branch for each task. This is how rockstar can contribute to coolapp:

$ git checkout -b feature1
$ git commit -m "rockstar calls it a day"
$ git push -u origin feature1

The last command is extremely important: he did not merge his changes into the local master but pushed the new branch to the server (into the forked repository).

Rockstar Pushed

Sigh, Pull Request

Unfortunately, the GitHub-like pull request feature has not landed in Gitblit yet, so meanwhile rockstar has to email the link of the feature1 branch to the owner for code review and further validation. You can right-click the highlighted feature1 branch and copy the URL to email.

If the admin/owner has any suggestions on the commit, he can reply to the email, and rockstar has to make the necessary changes and update the branch. Provide the -uf option to force-push in case you amend the changes into the last commit.

$ git push -uf origin feature1

This way admin or owner can pull the feature1 branch and merge into the main repository for release.

Merging Pull Requests

Now the admin/owner, on the other hand, has to pull the branch and merge it into the main repository if everything is okay. First, you need to create a new branch (use the same name to avoid confusion) and pull the changes into it. In case rockstar’s feature1 branch is older than the main master branch, you can bring it up to date by rebasing so that his commit appears on top.

$ git checkout -b feature1
$ git pull http://admin@172.18.11.178:1337/git/~rockstar/coolapp.git feature1
$ git rebase master
$ git checkout master
$ git merge feature1
$ git push origin master

Once rockstar’s commit has landed in the main repository, you can copy the URL of the commit and reply on the same email thread to notify him about it.
Rockstar's commit lands

Cleaning up the branches

Rockstar can delete both the local and remote feature1 branches as soon as he receives the acknowledgement from the admin/owner about the merge. He can pull the updates into the master branch and then get rid of the local/remote feature1 branch.

$ git checkout master
$ git fetch upstream
$ git merge upstream/master
$ git branch -D feature1
$ git push origin :feature1

Good Night!

If you found this article useful in any way, feel free to donate and receive my dilettante painting as a token of appreciation for your donation.

Do not let your reading habit die with Google Reader! Introducing After Reader!!


After Reader – Do not let your reading habit die with Google Reader!

This Chrome extension points the reader link (in the top black bar on all Google products) to your choice of feed reader. You can either choose from some existing readers or add your own.


Many people, including me, have been using and loving Google Reader for years, but unfortunately, Google has decided to shut it down. Luckily, we have some good alternatives available that we can use. But again, we cannot access them from Google’s black top bar. Hence my effort to keep that habit from breaking.

Even after Google removes the reader link from 1st July onward, this extension will keep the link there but point it to your choice of feed reader. Enjoy!

Installation

Just install the extension from the Chrome Web Store.

Source code available on Github

https://github.com/codef0rmer/after-reader

If you found this article useful in any way, feel free to donate and receive my dilettante painting as a token of appreciation for your donation.

AngularJS module for jQueryUI draggable and droppable


Last year, I played a lot with jQueryUI – mostly draggable/droppable – along with AngularJS, and it was a bit of a pain, so I decided to write a directive for it to make it easy for others to implement such functionality.

Demos and much more

http://codef0rmer.github.com/angular-dragdrop/#/

Fork it on Github

https://github.com/codef0rmer/angular-dragdrop

I have also written test cases 🙂
https://github.com/codef0rmer/angular-dragdrop/blob/master/test/index.html

Good Night!

If you found this article useful in any way, feel free to donate and receive my dilettante painting as a token of appreciation for your donation.