Author: Justin Lee

  • Hiring Singaporean Software Engineers

I’ve been hiring Software Engineers in Singapore over the past few years, and I’ve noticed a worrying trend.

As a local business, I always want to hire Singaporeans first: I want to give our own young people a chance, and I want to see our local SMEs develop a strong Engineering culture. But recently it has become really difficult.

One of the reasons could be that good local Software Engineers have gone to the Tech Giants, MNCs, banks, GovTech, etc., where they are paid reasonably well, get good benefits, and so on.

The unfortunate thing is that small businesses and startups are left with the lower percentiles of the cohort, and that sucks, because these people waltz in and ask for a pretty high salary “because industry standard”.

    Bullshit. The “standard” is based on what you can do, not what others can do.

I recently put out a job advertisement on LinkedIn, and after about a week I had received around 100 applications: only 3 were Singaporean/PR, 2 were from Malaysia, 5 from the Philippines, 6 from Myanmar, 2 from China, and the rest were from (you probably guessed correctly) — India.

As a local business, and with the recent COVID-19 Government incentives that favour hiring locals (JSS, JGI), I called up all the Singaporean candidates for interviews, but was met with disappointment.

Note 1: All of the candidates I describe below (regardless of nationality) are mostly fresh graduates with 1 year or less of work experience.

Note 2: Also, just to be clear, this is not a blog post to diss all Singaporean tech graduates. Frankly, most graduates I’ve met from local universities like NUS or NTU are decent. This blog post is targeted at the “COVID-19 cohort” I encountered this year.

I usually put candidates through a fairly simple technical quiz. It is nowhere close to the likes of Google’s or Facebook’s, but it is effective in weeding out candidates who lack basic technical knowledge. There’s no coding; just questions and verbal answers.

    What disappointed me the most was that despite having the upper hand in terms of Government support and language proficiency, Singaporean candidates came out weaker.

The Government has actually done a good job of enticing businesses to hire locals: it is in effect subsidising the salaries of Singaporeans, while making foreigners more expensive by raising the qualifying salary brackets and imposing quotas and levies.

In addition, I like to work with locals because, you see, the primary language of Singapore is English. (This is different from our national language, Malay, which is purely symbolic.) Singaporean candidates are usually very conversant in English, especially those born after the 1990s. The other nationalities — especially those from Myanmar or China — are not so lucky: English is not their primary language, but despite their struggles they picked it up, travelled here, and took a course (such as the Graduate Diploma from NUS-ISS, from which I interviewed quite a few applicants).

I commend the effort of candidates who try their best to explain a concept in a language they aren’t familiar with. I could tell that they knew, for example, what a particular algorithm was about, but were unable to explain it in English. They try their best, and as a recruiter, you can tell whether somebody knows their shit or not.

However, Singaporeans who are naturally strong in English tend to lack actual technical understanding of many things; instead, because of their fluency in English, they are able to respond quickly with guesses, a.k.a. “smoke” — sometimes rather intelligent guesses — but that’s not what a recruiter is looking for. We don’t hire you to tikam (guess). We expect you to know your stuff. Of course I don’t expect candidates to know everything, but the fundamentals must be there.

Sometimes after a candidate answers incorrectly, or tries really hard, or makes a really close guess, I will share the answer and explain why. Some Singaporean candidates I have come across seem indifferent to the fact that the recruiter has actually taken the effort to explain something, and simply respond with a rather monotonous “oh, OK.” The better candidates would usually ask more questions, or show that they had a eureka moment.

The other trend I noticed is that Singaporeans tend to pick the swanky Degree courses, like Cyber Security. Whilst I have no qualms with somebody taking a keen interest in this area of specialisation, it is quite impossible to learn it in a 1- or 2-year part-time Degree without ever having learnt the fundamentals.

I asked one candidate why he pursued a Degree course. The answer was exactly what I expected to hear: “A Degree is the minimum now.”

Let me assure you it is not — at least not in the private sector. I have hired people with completely irrelevant Degrees who took a bunch of Udemy courses, worked on personal projects of their own accord, and proved that they knew their stuff. A Degree makes you expensive (because you will have priced your expectations accordingly), and that actually makes you less employable. I much prefer a candidate with work experience over one with a frivolous Degree.

    ~

    So with the benefit of English proficiency, plus not having to deal with the perils of living alone in a foreign country, why do these Singaporean candidates turn out weaker?

    Wrong school/course? Should we stop letting crappy private Degree programmes be taught in Singapore?

    Not paying attention in class? Have they gone through too many years of mundane Singaporean education and grown nonchalant, uninterested and bored?

    Too comfortable? Should we maybe instead yield a higher rate of unemployment so that there’s a pressure to get better?

    I really don’t know leh.

  • Hope for humanity

As a kid, I was fascinated by anything that could fly – planes, helicopters, rockets. It was a privilege to be able to fly on a plane back then, as it was still relatively expensive and only middle- to high-income families could afford it.

In my mid twenties (around 2006), I took up remote control helicopters and planes as a hobby. I still remember spending almost half a year learning how to hover an R/C helicopter, practising almost every night at my void deck.

    Me flying a Multiplex EasyGlider (first generation) at Bedok Reservoir, a common place for “sloping”. R/C gliders are flown without motor power and are kept afloat by air currents.

    I wanted to attach a camera to my plane so I could get a view from above, but I was too early to the hobby and the technology wasn’t really available or was too expensive. Wireless first-person view (FPV) cameras and headsets matured several years later. By then, I was a busy working adult and dropped the hobby.

Almost a decade later, the technology had become so advanced that you could buy, in a single package under $1,000, a drone that is not only self-stabilising but also capable of obstacle avoidance and autopilot. I unexpectedly acquired a DJI Mavic Pro in late 2017 (a story for another day), but never really got around to flying it. Now it is practically banned everywhere.

So why the interest in flying? When I was young, I read about the first man on the moon. I’ve always wondered what it would be like to be “flying”, and to experience weightlessness in space. I was too young back then to understand what a feat it was to get to the moon, but my child’s recent interest in space and our solar system piqued my own interest again, so I bought a telescope and started looking up at the skies like a curious child.

And for the first time in my life, at the age of 38 (my lucky kid got to as well), I got to see Jupiter and Saturn for real.

    Some people laugh and ask: What’s the big deal? These pictures suck. We have super clear pictures of Jupiter and Saturn all over the Internet.

Sure, but actually seeing it for yourself is different from seeing a photo on the Internet. Looking through a telescope, you know these aren’t just something you read about in a book – the planets are really out there. You also get a sense of how vast space is.

    You know, people dream of many things… big house, being rich, being powerful, this and that, but I salute those (like Elon Musk) who dream of the impossible: getting the heck out of Earth, because that is a whole other level of dreaming.

    Distance between the Earth and Moon, to scale.
    (Image credits: Wikipedia)

If you followed the SpaceX Dragon Demo-2 mission, the journey to the ISS seemed like a big trip, but the moon (384,000 km away) is more than a thousand times further than the ISS (340 km). Sure, most of the effort actually goes into escaping Earth’s gravity, but it’s still a long way to the moon.

51 years ago (1969), we landed the first man on the moon. The Apollo rockets were (mostly) hand-made. This was before we even had the Internet, when most TVs were still black and white. There were no 3D printers and no modern computer-aided design, modelling or simulation. Everything was calculated by hand. Yet we were able to send men to the moon and back. It was so unthinkable that conspiracy theories sprang up.

With all the technological and manufacturing advancements since, I wonder why we haven’t developed commercial space flight earlier.

I certainly hope we get to see commercial space travel this decade; it finally looks within reach.

    Photo of the moon, taken from my bedroom with a Celestron 4SE telescope.

    I also hope to be alive to see the first man on Mars.

As long as we keep trying, there’s still hope for humanity. The COVID-19 pandemic shows how important it is to be able to sustain life outside our own planet, because one day another virus might just wipe us all out.

    We need to stop fighting each other and work towards saving our own kind, and I believe that the answer is in space travel.

  • Three-piece next generation ERP unit

I read about the new 3-piece next generation ERP unit and must say I was a little bit disappointed. I applaud Transport Minister Ong Ye Kung for openly sharing the decision-making process that led to this design.

As an auto enthusiast, I’ve had my fair share of IU failures and have also personally installed various in-car accessories such as cameras, GPS, Bluetooth, etc.

    I believe LTA has an expert panel, but I hope some of my opinions here can persuade the Transport Ministry to reconsider the following:

    Processing unit location

    The photo shows the unit installed on the passenger side and I certainly hope this will not be the case. It should be located under the driver’s dashboard.

While I think it is a good idea to reduce heat by relocating the processing unit into the car, the amount of effort required to install a device across the center console can sometimes be tremendous – especially in modern cars, where there is already so much equipment and cabling.

If contractors do not wire it carefully, the wires can be tucked through areas with sharp edges (e.g. the metal brackets that hold the dashboard), which may lead to abrasion and eventually a wire breakage, or worse, a short circuit after years of heat and vibration.

    Also an important consideration would be cable entanglement during dashboard removal which is sometimes necessary for air-con repairs. I have seen (existing) IU wires running across from the left to the right because the power is tapped from a fusebox located on the passenger side. In situations like these, wire breakage can happen when the dashboard is removed. Luckily the existing IU uses a simple two-wire 12V DC supply. If the new ones contain a data cable (for the antenna + display), it may require a full cable replacement.

    Cabling madness in a modern BMW 5 series

    Adapter to fit Japanese vehicles that have ETC

LTA should also consider making the in-car processing unit conform to the Japanese ETC form factor, or offering an adapter compatible with Japanese Domestic Market vehicles that have a built-in ETC bay, so the unit can be tucked away neatly in the space designed for it.

    Japanese ETC unit integrated into the vehicle dashboard (Credits: Wikipedia)

    It is also worth noting the small size of the 2-piece Japanese ETC 2.0 units.

    Japanese ETC unit is a 2-piece system with an external antenna and in-vehicle unit (Credits: Amazon)

    No touchscreen please

I certainly do not encourage adding any more touch devices that distract drivers – especially taxi/private-hire drivers. The automotive industry has been moving away from touch towards voice control or gesture control, and I would hope that LTA reinstates a static display – similar to that of the bike unit.

    Static display on a bike IU (Credits: LTA)

    No screen! Use Bluetooth + App!

In fact, it would be a step up if the IU could be paired directly with a smartphone and rely completely on a smartphone app for the required functions. Similar to Parking.sg: wouldn’t it be awesome if I could top up and check my IU balance from my phone?

    Parking.sg app is a great move towards a paperless society (Credits: MND)

If it can be app-enabled, then the screen can be made a removable or optional device: simply unplug it, leaving only the antenna.

Minister Ong’s remark on smartphones, “there will be operational issues like battery running out, forgetting to bring smartphones, etc.”, is also not quite valid, since most of us are able to charge our smartphones in our vehicles. Regardless, the IU should not be dependent on the smartphone, or vice versa; the smartphone simply gives us access to more information where required. This is similar to many in-car cameras, which have no displays and perform all actions through an app.

I understand there can be contractual obligations, but if LTA can just get it right, we won’t have to go through ERP v4.0 in another 5-10 years. It’s a massive operation, and grandfathered bad designs can take a long, long time to replace.

  • Why you SHOULD chope your seat, especially during COVID-19

Chope culture: a uniquely Singaporean practice where diners at open food centres place small items such as tissue packets, business cards or water bottles to reserve a table or seat.

    I know many people are not in favour of this, and I personally used to hate it, but let me explain using the logic of parallelism that it is in fact more efficient. Everybody should chope their seats and split up and quickly order their meals, so that they can convene, eat, and leave ASAP – especially during this COVID-19 situation where half the seats have been crossed out for safe distancing. Quicker turnaround means more people get their seats, less community spread, etc.

    … using the logic of parallelism that (chope culture) is in fact more efficient

The other reason to socially accept the “chope” culture is that those eating alone or carrying a child can find it very difficult to secure a seat, with no companion to hold one.

    So why do people hate it? Because “chope” is not gracious? Rubbish lah. It’s exactly because we are ungracious – when we want a seat, we feel a sense of entitlement to a vacant seat. We see an inanimate object, and feel that it doesn’t deserve a “seat”. Seriously, what’s the difference between a tissue packet and a fully grown man waiting at the table?

    … what’s the difference between a tissue packet and a fully grown man waiting at the table?

    With that, I shall share a true story…


I went to ABC Brickworks hawker centre this afternoon (Tuesday, 28 Jul 2020) at around 1PM for lunch with my colleague. We placed a tissue packet + a Fisherman’s Friend on the table to “chope” a 2-pax table, and quickly left to order our food.

Then this father (50+) and son (20+) duo came, sat down, and moved our “chope” away. (I saw it with my own eyes, as the stall I was queuing at is nearby.)

    Then when we returned, the son initially denied moving the tissue, then the father came and say our tissue never put in the center, cannot see properly, etc. etc. A big pack of lies.

    So I said: “Please lah, it was in the middle. I saw you move it, so please admit it.”

    OK, they left. Then we started eating.

    Eat halfway, the uncle came back and started scolding us. Say this is neighbourhood hawker, no such thing as reservation, etc. etc.

    I say, please lah, we “chope”, quickly order, quickly eat, quickly go. Isn’t it more efficient than one person sit, one person order and take turns?

    Then he started lecturing us, “You listen carefully (你听好好), I tell you (我跟你讲), this place is not CBD, not restaurant, this is neighbourhood, etc. etc.”

A lot of Hokkien arguments ensued (when I switch to Hokkien, things are getting serious…)

    I said, look, so what you want? Limpeh eating my lunch halfway. Say sorry? OK, “sorry”. Now, leave and let me eat my lunch. Then you still not happy, what you want? Call police?

    Then after my meal, I turned around briefly looking for a stall to order drinks. Then I heard the uncle shouting from his seat: “Come lah, come come. Come!!” I didn’t even see them – they were sitting at another table. That’s when I realised they were behind, still wanting to pick a fight. ¯\_(ツ)_/¯


We went, ate, and left, all in under 20 minutes – including the 3 minutes or so spent arguing with the uncle.

    Please, “chope” is more efficient. Trust me, I’m an engineer.

  • Raspberry Pi 3.5″ Display X11 Configuration

    I got a really cute 3.5″ TFT display for my RPi from Cytron which has an ADS7846 touchscreen controller.

    After installing the drivers (by following the instructions here), I realised that the X/Y axes of the touchscreen were flipped.

It took me a while to figure this out, so I’m writing a blog entry as documentation.


SSH is disabled by default on a fresh RPi OS installation. To use the RPi in headless mode with SSH enabled, create an empty file at /boot/ssh. This can be done from another computer, e.g. on my Mac:

    touch /Volumes/boot/ssh

After the RPi boots, connect it to a LAN cable, SSH to it (find its IP address via the DHCP server’s leases table), and then run raspi-config to do some initial configuration.

    raspi-config

    Now install the LCD drivers.

To rotate the display so that the HDMI ports are on top, edit /boot/config.txt:

    dtoverlay=tft35a:rotate=270

    Next, the touchscreen input will also have to be rotated accordingly. Edit /etc/X11/xorg.conf.d/99-calibration.conf:

    Section "InputClass"
      Identifier "calibration"
      MatchProduct "ADS7846 Touchscreen"
      Driver "evdev" # Force the evdev driver
      Option "Calibration" "3936 227 268 3880" # Default values
      Option "SwapAxes" "1" # Required for landscape orientation 
      Option "InvertX" "true" # Required for rotate=270
      Option "InvertY" "true" # Required for rotate=270
    EndSection
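
    To sanity-check the orientation, it helps to reason through what SwapAxes and the inversions do to a raw reading. Here is a rough, purely illustrative Python sketch of that mapping (my assumption of the conceptual order; the ADS7846 reports 12-bit values 0..4095, and evdev’s actual calibration pipeline may apply these steps differently):

    ```python
    # Illustrative only: models what SwapAxes/InvertX/InvertY do conceptually.
    # The ADS7846 reports 12-bit raw coordinates (0..4095); evdev's real
    # pipeline (and the order it applies these options in) may differ.

    RAW_MAX = 4095

    def map_touch(raw_x, raw_y, swap_axes=True, invert_x=True, invert_y=True):
        """Map a raw touch reading to oriented coordinates for rotate=270."""
        x, y = (raw_y, raw_x) if swap_axes else (raw_x, raw_y)  # SwapAxes "1"
        if invert_x:
            x = RAW_MAX - x  # InvertX "true"
        if invert_y:
            y = RAW_MAX - y  # InvertY "true"
        return x, y

    # A touch at the raw origin lands at the opposite corner:
    print(map_touch(0, 0))     # (4095, 4095)
    print(map_touch(4095, 0))  # (4095, 0)
    ```

    If a tap still lands in the wrong corner after rebooting, toggling one of these three options at a time quickly narrows down which axis is off.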

  • Mac and Dell monitor display quality issues over HDMI

Update (2020/12/01): for macOS Big Sur, please follow the instructions here.

If you have a MacBook, iMac, or Mac mini and use an HDMI cable to connect to a Dell monitor, you may notice that the image seems over-processed, as if it were over-sharpened – I noticed this when I connected my MacBook Pro to my Dell U2913WM.

I found a setting in the monitor to turn the sharpness down from the default 50 to 0. It improved the image quality somewhat, but when I switched back to my Mac mini (which was connected via DisplayPort), the quality was still better.

    Sharpness setting in the Dell monitor; setting to 0 made it look better, but still does not match a direct input via mDP

It turns out that this is caused by the Mac sending the video signal as YPbPr (component) instead of RGB. This can be seen in the monitor’s Color Settings menu.

    Color Settings showing Input Color Format as YPbPr

Why does this happen? When an HDMI display is connected, the display’s capabilities are negotiated using what is known as an EDID. For some reason, Macs default to YPbPr here, and there’s no built-in way to select or force RGB.

I found this blog, which in turn references a comment on another blog. TL;DR: download a small Ruby script and run it. The details are in those blogs; I’m just writing them here for my own future reference.

    Steps:

• Run the Ruby script. It will generate a folder containing a file, e.g. DisplayVendorID-10ac/DisplayProductID-4080
• Reboot your Mac into recovery mode (hold down Command+R)
• Copy the generated file into /Volumes/Macintosh\ HD/System/Library/Displays/Contents/Resources/Overrides:
  • If the matching folder already exists there, copy just the file into it.
  • If the folder doesn’t exist, copy the entire folder.
  • If the file already exists, make a backup before overwriting it.
• Reboot, and reconnect your display.

    Color Settings showing Input Color Format as RGB after reboot

    The difference is immediately noticeable. (I also had to restore my sharpness back to default: 50.)
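
    As an aside, the override folder and file names appear to be just the display’s EDID vendor and product IDs rendered in lowercase hex (the Ruby script works these out for you). A small hypothetical helper, not part of that script, illustrating the naming convention as I understand it:

    ```python
    # Hypothetical helper (not part of the Ruby script): shows how the override
    # path appears to be derived, assuming the convention of lowercase-hex EDID
    # vendor and product IDs without leading zeros.

    def override_path(vendor_id: int, product_id: int) -> str:
        """Build the Overrides-relative path macOS looks for."""
        return f"DisplayVendorID-{vendor_id:x}/DisplayProductID-{product_id:x}"

    # Dell's EDID vendor ID ("DEL") is 0x10ac; the example file above is:
    print(override_path(0x10AC, 0x4080))
    # DisplayVendorID-10ac/DisplayProductID-4080
    ```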

    Displays in System Preferences showing EDID override.