## command line c++ dev tools: Never phased out

Consider the C++ build chain + dev tools on the command line. They never get phased out, never lose relevance, never become useless, and will likely stay that way at least till my 70s. New tools always, always keep the old features. In contrast, java and newer languages don’t need so many dev tools, and their tools are more likely to use a GUI.
  • — top examples of similar things (I don’t have a good adjective)
  • unix command line power tools
  • unix shell scripting for automation
  • C API: socket API, not the concepts
  • — secondary examples
  • C API: pthreads
  • C API: shared memory
  • concepts: TCP+UDP, http+cookies
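To illustrate how frozen-in-time the C socket API is, here is a minimal sketch of the classic send/recv round-trip. A local socketpair stands in for a real TCP connection so it runs without a network; error handling is trimmed, and the function name is made up:

```cpp
#include <string>
#include <sys/socket.h>
#include <unistd.h>

// The BSD socket calls here (socketpair/send/recv/close) are essentially
// unchanged since the 1980s. A local socketpair stands in for a TCP
// connection so the sketch is self-contained.
std::string echo_roundtrip(const std::string& msg) {
    int fds[2];
    if (socketpair(AF_UNIX, SOCK_STREAM, 0, fds) != 0) return "";
    send(fds[0], msg.data(), msg.size(), 0);       // "client" writes
    char buf[256];
    ssize_t n = recv(fds[1], buf, sizeof buf, 0);  // "server" reads
    close(fds[0]);
    close(fds[1]);
    return std::string(buf, n > 0 ? static_cast<size_t>(n) : 0);
}
```

The same handful of calls (plus bind/listen/accept/connect) still covers most socket work today.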
Insight — unix/linux tradition is more stable and consistent. Windows tradition is more disruptive.

Note this post is more about churn (phase-out) and less about accumulation (growing depth).


## stable base; fluid superstructure in G5 tech skills

My perspective in this post is mostly the tech interview: accumulation, retention of knowledge, prevention of loss due to churn. As I age, I’m more selective about which new technologies to invest in.

In contrast to the interview perspective, the GTD perspective is dominated by localSys ! So churn doesn’t matter.

Many of these technologies are past their peak, though none is losing relevance. I tend to perceive these tech skills as robust, resilient and time-honored. Among them I see a partial pattern: many exhibit a stable base of skills, while some also show a fluid superstructure on top.

  1. essential algorithms/data_structures and bigO — stable, possibly growing base + fluid superstructure
  2. java skillset — has a stable base i.e. coreJava + fluid ecosystem including jGC, jxee
  3. C/C++ — has a stable base skillset including TMP, STL… The superstructure is fluid mainly due to c++0x
  4. SQL and socket — each has a stable base
  5. pthread, unix internals ..– ditto
  6. unix (no QQ topics, but many GTD skills) — stable superstructure skillset including instrumentation, utils and scripting. Base? tiny.
  7. http stack — stable but tiny base including session/cookies + fluid superstructure

## longevity rating: java is an outlier ] language_war

Among languages, java is way ahead of the pack. We need to treat java as an exceptional outlier; with that adjustment, c++ looks rather solid and decent. Shall we merge this table into my pastTechBet.xlsx sheet? Partially merged; no need to update the sheet further.

| longevity rating (#bias) | tech skill | mkt share prominence | era | domains |
| --- | --- | --- | --- | --- |
| 80% | java | robust | 2000’s | |
| 40% | py | rising | 2000’s | |
| 50% | c/c++ | fell | 1980’s | HFT, gaming, telco, embedded, AI/ML |
| 20% | c#/dotnet | fell | 2010’s | |
| 30% | php | ? | 2000’s | |
| 10% | perl | FELL | 2000’s | automation |
| 40% | javascript | rising | 2000’s | |
| 30% | RDBMS/sql | fell | 1990’s | |
| 70% | socket | robust | 1990’s (probably) | |
| 90% | TCP/IP | dominant | 1970’s | |
| 20% | MOM | robust | | |
| 90% | http stack | dominant | 2000’s | |
| 90% | unix “tradition” | dominant | beyond memory | |

## ruthless march@technology

There’s a concept of “best practices across the industry”, as I experienced in Macq. Using new technology, things can be done faster, at larger scale and with more automation, even though I may feel it doesn’t make much difference.

CTO’s don’t want to be seen as laggards. Same motivation at MS-Iceman, Quoine …

  • PWM-billing, PWM-comm. I remember Mark wanted “strategic improvement”, not incremental improvement. He needed it for his promotion, i.e. his track record of achievements (政绩).
  • RTS infrastructure was considered (by Jack He and outsiders) outdated and lagging behind competitors

You can call it the “ruthless march of technology”, a ruthless progress. At a fundamental level, this “progress” can wipe out the promised benefit of “slow-changing, stable domain knowledge”.

  1. quant skillset
  2. SQL skillset — affected by noSQL
  3. c++ skillset — perhaps affected by c++0x
  4. FIX skillset — perhaps affected by faster proprietary exchange APIs?
  5. … However, the skills above are still relatively robust. Other skillsets (they are not today’s focus) have proved arguably more robust against this march — sockets, pthread, STL, coreJava, bond math,.. I listed them in my spreadsheet pastTechBet.xlsx.

## specializations fac`same IV quizzes 20Y on #socket !! c++11

(tagline: the most churn-resistant specializations.)

Socket IV questions have remained unchanged for 20Y: unmatched stability and churn-resistance, though not necessarily accumulating.

  • Preservation of t-investment
  • Preservation of accumulation
  • Preservation of deep learning? Socket programming has modest depth.

Q: Beside the specialization of socket programming, are there other specializations that are dominated by the same old QQ questions 20 years on?

  • [S] classic data structures
  • [S] classic sort/search algorithms on int-array, char-array, list ..
  • [S] classic traversal algorithms on trees, general graphs
  • [s] classic recursive, DP, greedy algorithms beyond graphs
  • [S] pre-c++0x core-C++ (a specialization!) questions are largely unchanged. C++11 questions are rooted in the old knowledge base.. BUT most of the c++11 QQ topics will likely fall out of interview fashion
  • [s] cross-language concurrency primitives.
  • unix file/process/string/signal manipulation
  • unix shell scripting — low market value
  • [S] SQL — including join, index design … but seldom quizzed in depth nowadays
  • [S] regex — seldom quizzed, but often needed in coding
  • [S=classic, well-defined specialization]
  • [s=slightly less well-defined specialization]
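As an illustration of the “[S] classic traversal” bucket above, the breadth-first search quizzed today is the same BFS of 20 years ago. A minimal sketch on an adjacency list (function name made up):

```cpp
#include <queue>
#include <vector>

// Return the vertices of an adjacency-list graph in BFS order from `start`.
// This exact pattern -- visited flags + a FIFO queue -- is the churn-free
// core of countless interview questions.
std::vector<int> bfs_order(const std::vector<std::vector<int>>& adj, int start) {
    std::vector<bool> seen(adj.size(), false);
    std::vector<int> order;
    std::queue<int> q;
    q.push(start);
    seen[start] = true;
    while (!q.empty()) {
        int v = q.front();
        q.pop();
        order.push_back(v);
        for (int w : adj[v])
            if (!seen[w]) { seen[w] = true; q.push(w); }
    }
    return order;
}
```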

Now the disqualified skills

  1. JGC + jvm tuning — high churn over 20Y
  2. TMP — new features introduced in c++11

## criticalMass[def] against churn ] tech IV

See also body-building impact{c++jobs#^self-xx

Critical mass of IV knowledge (eg from core java self-study) is one of the most effective defenses against technology churn in tech interviews. Once I accumulate the critical mass, I don’t need a full-time job to sustain it.

I have reached critical mass with core java IV, core c++ IV, swing IV (no churn) and probably c# IV.

The acid test is job interviews over a number of years.

Q: how strongly is it (i.e. critical mass) related to accumulation?
A: not much AFTER you accumulate the critical mass. With core java I did it through enough interviews and reading.

Q: how strongly is it related to leverage?
A: not much, though critical mass enhances leverage.

Q: why do some domains offer no critical mass?
A: some interview topics (jxee) have limited depth
A: in some interview topics (TMP, py), I could identify no pattern in the questions.


## c++ changed more than coreJava: QQ perspective

Recap — A QQ topic is defined as a “hard interview topic that’s never needed in projects”.

Background — I used to feel as new versions of an old language get adopted, the QQ interview topics don’t change much. I can see that in java7, c#, perl6, python3.

To my surprise, compared to java7/8, c++0x has more disruptive impact on QQ questions. Why? Here are my guesses:

  • Reason: low-level — c++ is more low-level than java, at least in terms of interview topics. Both java8 and c++0x introduced many low-level changes, but java interviewers don’t care that much.
  • Reason: performance — c++0x changes have performance impact, esp. latency impact, which is the hot focus of my target c++ employers. In contrast, java8 doesn’t have much performance impact, and java employers are less latency-sensitive.
  • Reason: template — the c++0x feature set has a disproportionate amount of TMP features, which are very hard. No such “big rock” in java.
    • move/forward, enable_if, type traits
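To make that “big rock” concrete, here is a tiny hypothetical sketch of the kind of c++0x TMP now quizzed: enable_if plus a type trait selecting an overload at compile time, with perfect forwarding in the non-integral branch. The function name is made up for illustration:

```cpp
#include <string>
#include <type_traits>
#include <utility>

// enable_if + is_integral pick an overload at compile time (SFINAE).
// Integral arguments are doubled; everything else is perfect-forwarded
// through unchanged.
template <typename T>
typename std::enable_if<std::is_integral<T>::value, T>::type
process(T x) { return x * 2; }

template <typename T>
typename std::enable_if<!std::is_integral<typename std::decay<T>::type>::value,
                        typename std::decay<T>::type>::type
process(T&& x) { return std::forward<T>(x); }  // move/forward in action
```

Each of move/forward, enable_if and type traits appears here, which is why this cluster dominates c++11 QQ lists.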

Q: if that’s the case, for my career longevity, is c++ a better domain than java?
A: I’m still biased in favor of low-level languages.

Q: is that a form of technology churn?
A: yes since the c++11 QQ topics are likely to show up less over the years, replaced by newer features.

## [17] proliferation → consolidation.. beware of churn

This is an extension of my 2015 blog post https://bintanvictor.wordpress.com/2015/03/31/some-of-the-worst-technology-churns-letter-to-tanko/

Imagine you spent months of serious personal effort [1] learning to use, debug and tune, say, MongoDB, but after this project you only find projects that need superficial Mongo knowledge. The developer’s time-investment has no recurring return. I think this is widespread in tech: a domain heats up, attracting too many players creating competing products with varying degrees of similarity. We wish these products were mostly similar, so our time investment could help us more than once, rather than learn-and-forget like 狗熊掰棒子 (the proverbial bear picking corn cobs, dropping one to grab the next). Often it takes decades to see some consolidation among the competitors, when most of them drop out of the race and one player emerges dominant, or a common standard [2] is adopted with limited vendor extensions.

Therefore I see two phases: proliferation -> consolidation. The churn in the early phase represents a hazardous pitfall.

If we invest too much learning effort there we get burned.

  • Javascript packages
  • JVM languages — javascript, Scala, Groovy, jython
    • I don’t even know which company uses them
  • ORM and database access “frameworks”–ADO.net, LINQ, EntityFramework, SpringJDBC,  iBatis,
  • Data Grid and NoSQL — Terracotta, Hazelcast, Gigaspace, Gemfire, Coherence, …
  • MOM — tibco, solace, 29west, Tervela, zeroc
  • machine learning?
  • web app languages
    • php, perl and the LAMP stack
    • Javascript MEAN stack
    • ASP and the Microsoft stack
    • Java stack

[1] You could have spent the time on personal investment, or something else if not required by the project.

[2] Some positive examples of standardization —

  1. RDBMS vendors
  2. Unix vendors
  3. c++ vendors — mostly GCC vs Microsoft VC++
  4. Java IDEs; c++/java/c# debuggers
  5. cvs, svn, git

A few development technologies “free” of proliferation pains —

  1. socket and system programming — complexities are low level and in C not c++
  2. core java
  3. SQL
  4. c/c++
  5. Unix know-how for testing, investigation, devops and process management
    1. shell scripting,
    2. regular expressions
    3. searching

## @55, Safer2b manager or hands-on dev@@

Hi Shanyou,

Based on your observations, when I reach 55, do you think it’s safer as a manager or a hands-on developer? “Safer” in the presence of

  1. competition from younger generation
  2. competition from same age group or older
  3. new, disruptive technologies
  4. technology obsolescence (what I call technology “churn”).
  5. outsourcing

Among these threats, my concern is primarily #1 but what about you?

## fascinated by analysis@fail`tech brands

From analyzing failed tech brands, I tend to learn something (very indirectly) about competition, churn, focus, …

  • Nokia — refocused on its traditional strength in telecom infrastructure
  • Ericsson
  • RIM
  • Yahoo
  • Novell

I feel my choices of java and c++ have been good, but I regret my investments in dotnet and pthreads.

I feel any dominant technology can get displaced. A few examples (not intended to be complete) — C/C++ (by java), RDBMS, Solaris (by Linux)

My tech bets for 2019-2020:

  1. system knowledge #CSY
  2. socket
  3. c++ TMP — a pure QQ domain for IV muscle-building
  4. c++ threading

## big-data arch job market #FJS Boston

Hi YH,

My friend JS left the hospital architect job and went to some smaller firm, then to Nokia. After Nokia was acquired by Microsoft he stayed for a while then moved to the current employer, a health-care related big-data startup. In his current architect role, he finds the technical challenges too low so he is also looking for new opportunities.

JS has been a big-data architect for a few years (2Y+ in the current job, and perhaps in earlier jobs). He shared many personal insights on this domain. His current technical expertise includes noSQL, Hadoop/Spark and other technologies he didn’t name.

He also used various machine-learning software packages, either open-source or in-house, but when I asked him for package names, he cautioned me that there’s probably no need to research any one of them. I get the impression that the number of software tools in machine learning is rather high and no consensus has yet emerged. There’s presumably no consolidation among the products yet. If that’s the case, then learning a few well-known machine-learning tools won’t enable us to add more value to a new team using a different machine-learning tool. I feel these are the signs of a nascent “cottage industry” in its early formative phase, before the much-needed consolidation and consensus-building among the competing vendors. The value proposition of machine learning is proven, but the technologies are still evolving rapidly. In one word: churning.

If one were to switch career and invest oneself into machine-learning, there’s a lot of constant learning required (more than in my current domain). The accumulation of knowledge and insight is lower due to the churn. Job security is also affected by the churn.

Bright young people are drawn into new technologies such as AI, machine learning and big data, and less drawn into “my current domain”: core java, core c++, SQL, script-based batch processing… With the new technologies, since I can’t effectively accumulate my insight (and value-add), I am less able to compete with the bright young techies.

I still doubt how much value machine-learning and big-data technologies add in a typical set-up. I feel 1% of the use-cases have high value-add, but the other use-cases are embarrassingly trivial when you actually look into them. I guess a typical set-up mostly consists of

  1. collecting lots of data
  2. storing it in SQL or noSQL, perhaps on a grid or “cloud”
  3. running clever queries to look for patterns, i.e. data mining

See https://bintanvictor.wordpress.com/2017/11/12/data-mining-vs-big-data/. Such a set-up has been around for 20 years, long before big-data became popular. What’s new in the last 10 years probably include

  • new technologies to process unstructured data (requires human intelligence or AI)
  • new technologies to store the data
  • new technologies to run queries against the data store

## 前辈 civil engineer ^ old C programmers

Opening example, which I shared with Wael: if you meet a regular (not specialist) civil engineer aged 50, you respect and value his skills. But what about a C programmer of the same age? In the US, I guess, he is respected much like the civil engineer, but in SG? He is likely to be seen with contempt. The key is the /shelf-life/ of the skill.

Look at civil engineers, chemical engineers, accountants, dentists, carpenters or history researchers (like my dad). A relatively high percentage of those with 20Y experience are 前辈 (revered veterans). These fields let you specialize and accumulate.

In contrast, look at fashion, pop music, digital media… I’m not familiar with these professions, but I feel someone with 20Y experience may not be 前辈. Why? Because their earliest experiences lose relevance like radioactive decay. The more recent the experience, the more relevant to today’s consumers and markets.

Now let’s /cut to the chase/. For programmers, there are some high-churn and some “accumulative” technical domains. It’s each programmer’s job to monitor, navigate, avoid or seek. We need to be selective. If you are in the wrong domain, then after 20Y you are just an old programmer, not a 前辈. I’d love to deepen my understanding of my favorite longevity[1] technologies like

  • data structures, algos
  • threading
  • unix
  • C/C++? at the heart or beneath many of these items
  • RDBMS tuning and design; SQL big queries
  • MOM like tibrv
  • OO design and design patterns
  • socket
  • interactive debuggers like gdb

Unfortunately, unlike civil engineering, even the most long-living members above could fall out of favor, in which case your effort doesn’t accumulate “value”.

– C++ is now behind-the-scenes of java and c#.
– threading shows limited value in small systems.

[1] see the write-up on relevant55

–person-profession matching–
An “accumulative” profession like medical research can 1) be hard to get into, 2) require certain personal attributes like perseverance, attention to detail and years of focus, and 3) be uninspiring to an individual. Only a small percentage of the population gets into such a career. (The drop-out rate could be quite high.)

For many years in my late 20’s I was completely bored with technical work, esp. programming, in favor of pre-sales and start-up. But after my U.S. stay I completely changed my preferences.

## big data is!! fad; big-data technologies might be


My working definition — big data is the challenges and opportunities presented by the large volume of disparate (often unstructured) data.

For decades, this data has always been growing. What changed?

* One recent change, in the last 10 years or so, is data-processing technology. As an analogy, oil sand has been known for quite a while, but the extraction technology slowly improved to become commercially viable.

* Another recent change is social media, creating lots of user-generated content. I believe this data volume is a fraction of the machine-generated data, but it’s richer and less structured.

Many people see opportunities to make use of this data. I feel the potential usefulness of this data is somewhat /overblown/ , largely due to aggressive marketing. As a comparison, consider location data from satellites and cellular networks — useful but not life-changing useful.

The current crop of big-data technologies is even more hyped. I remember XML, Bluetooth, pen computing, optical fiber … each had its prime time under the spotlight. I feel none of them lived up to the promise (or the hype).

What are the technologies related to big data? I only know a few — NOSQL, inexpensive data grid, Hadoop, machine learning, statistical/mathematical python, R, cloud, data mining technologies, data warehouse technologies…

Many of these technologies had real, validated value propositions before big data. I tend to think they will confirm and prove those original value propositions in 30 years, after the fads have long passed.

As an “investor” I have a job duty to try and spot overvalued, overhyped, high-churn technologies, so I ask

Q: Will Hadoop (or another item in the list) become more widely used (therefore more valuable) in 10 years, as newer technologies come and go? I’m not sure.

http://www.b-eye-network.com/view/17017 is a concise comparison of big data and data warehouse, written by a leading expert of data warehouse.

## some of the memorable technology churns #tanko


Here’s my expanded list of “worst” tech domains in terms of technology churn. Nothing but personal bias. For every IT professional, it’s his or her personal responsibility to identify these domains, and perhaps avoid investing (too much) into them.

  • —- not ranked
  • java generics QQ knowledge is a fad compared to c++ TMP
  • Object-oriented perl
  • javascript toolkits
  • GWT
  • silverlight
  • Gemfire, Coherence …
  • ADO.net
  • EJB, Weblogic
  • struts, spring integration
  • functional programming
  • Windowing GUI technologies – X-windows, PowerBuilder, Delphi, Borland c++, …
  • perl — is slowly being displaced by python, though bash scripting is robust
  • Javascript libraries like node.js, angular, jquery, GWT
  • ORM — product proliferation
  • MOM products — products proliferation like tibco, solace, 29west, Tervela, zeroc …
  • datagrid and noSQL — products proliferation
  • high-level integration
    • SOA, microservices?
    • web services, REST
    • EJB, RMI, RPC
    • JMS
  • anything to do with big data —
    • Map reduce? I hope Hadoop remains the standard
    • cloud?
    • machine learning — product proliferation
  • Web app development in general —
    • java web development including struts
    • Microsoft web development
    • PHP? I hope this is a bit more stable, but there are definitely new packages gaining popularity
  • anything on Windows
    • Powershell seems to challenge vbscript.
    • Windows administration – there seem to be many new utilities added every 5 years, replacing the old
  • anything on mobile
    • WAP
    • SMS based apps — used to be so popular in zed’s heydays
    • WindowsPhone, Symbian

–churn-resistant, robust technologies

  • C/C++
  • C++ key libraries — STL, boost
  • socket, tcp/udp
  • Unix admin (relative to Windows admin) and scripting
  • core java i.e. at the core layer
  • SQL complex queries
  • DBA
  • Messaging architecture?
  • FIX
  • async architecture
  • http

## resilient WS tech: FIX, sh-script…

Background: the constant need to economize on my learning hours. Have to attempt to pick the “survivors”.

  • FIX as a tech skill is an example of unexpected resilience. (However, this is in a specialized domain, so competition is arguably lower.) FIX isn’t dominant. Exchange native API is faster. Many HFT shops don’t use FIX, but still FIX has good ROTI.
  • SQL is no longer dominant but still standard
  • Tibco isn’t dominant. Many teams use competing products. Still it’s resilient
  • XML, in the face of light-weight serialization protocols – json, protobuf
  • Bourne shell, in the face of python, perl…
  • STL, invented in the 80’s (??) with many limitations, challenged repeatedly
  • tibrv, challenged by 29west, solace,

I have a bias towards older technologies. They have stood the test of time.


2000–2002 were the first few years I spent in IT, and they had a deep impact on my outlook. However, there were many overstatements:

  • Too early to say — javascript had a surprise revival, even on Wall St! I have not decided to go back there.
  • Too early to say — perl was widely used on Wall St and was a key factor to my survival in GS.
  • SQL — the skills I acquired in GS are not completely irrelevant. Many (financial etc) systems still use it. Perhaps less used on the west coast in web 2.0 shops.
  • php — investment was not 100% lost. It did provide me a job at NBC. I think this is still a valuable skill on west coast. My php confidence is an asset.
  • mysql — investment was not completely lost. I would say my mysql experience gave me enough confidence and competence to take on other database systems.
  • apache — investment gave me valuable insight into network servers. I think apache is still widely used outside Wall St.
  • weblogic — investment was 90% lost but luckily I didn’t invest too much


## low-churn professions often pay lower #le2Henry

category – skillist, gzThreat

I blogged about several slow-changing professions — medical, civil engineers, network engineers, teachers, quants, academic researchers, accountants (including financial controllers in banks).

My overall impression is, with notable exceptions, many of the slow-changing domains don’t pay so well. We will restrict ourselves to white-collar, knowledge intensive professions.

Sometime between 2013 to 2015, a tech author commented — compared to the newer languages of javascript, ruby, objective-C etc, java programmers are a more traditional, more mature, more stable, more enterprise community.

https://bintanvictor.wordpress.com/2014/11/03/technology-churn-ccjava-letter-to-many/ is my comparison of java, c#, c++. Basically I’m after the rare combination of

– mainstream,
– sustained, robust demand over 15 to 30 years
– low churn

Someone mentioned entry barriers. A valuable feature, but I think it is neither a necessary nor a sufficient condition.

SQL and shell scripting are good examples. Very low churn; robust demand, mainstream. Salary isn’t highest, but decent.

## [15] once-valuable tech skills: personal xp

perl – lost to python
[$] tomcat, jboss. Weblogic lost most market share
apache, mysql
dns, unix network config
autosys – not used after GS
[$] sybase, oracle – after ML edge project I didn’t need it again.
[$] VBA – used once only. But Excel is going strong!

[$ = still has value]

–random list (not “top 10”) of longevity skills
eclipse, MSVS
Excel and add-in
javascript, though I don’t use it in my recent jobs
Linux shell
compiler knowledge
make, msbuild
bash and DOS batch scripting, even though powershell, vbscript and python/perl are much richer.

## [14] technology churn: c#/c++/j QQ #letter2many

(Sharing my thoughts again)

I have invested in building up c/c++, java and c# skills over the last 15 years. On a scale of 1 to 100%, what is the stability, shelf-life or “churn-resistance” of each tech skill? By “churn” I mean value-retention, i.e. how much market value my current skill would retain over 10 years on the job market. By default, a current skill loses value over time. My perl skill is heavily devalued (by the merciless force of the job market) because perl was far more needed 10 years ago. I also specialized in DomainNameSystem, Apache server administration and Weblogic. Though they are still used everywhere (80% of web sites?) behind the scenes, there’s no job to get based on these skills; these systems simply work without any skillful management. I specialized as a mysql DBA too, but outside some web shops, mysql is not used in the big companies where I can find a decent salary to support my kids and the mortgage.

In a nutshell, Perl and these other technologies didn’t suffer “churn”, but they suffered loss of “appetite” i.e. loss of demand.

Back to the “technology churn” question. C# suffers technology churn. The C# skills we accumulate tend to lose value as new features are added to replace the old. I would say dotnet remoting, winforms and linq-to-sql are some of the once-hot technologies that have since fallen out of favor. Overall, I give c# a low score of 50%.

On the other extreme, I give C a score of 100%. I don’t know of any “new” skill demanded by employers of C programmers. I feel the language and the libraries have endured the test of time for 20 to 30 years. Your investment in the C language lasts forever. Incidentally, SQL is another low-churn language, but let’s focus on c#/c++/java.

I give C++ a score of 90%. Multiple inheritance is the only churn feature I can identify. Template is arguably another churn feature: extremely powerful and complex but not needed by employers. STL was the last major skill that we had to acquire to get jobs. After that, we have smart pointers, but they seem to be adopted by many, not all, employers. All other Boost or ACE libraries enjoyed much lower industry adoption rates. Many job specs ask for Boost expertise, but beyond shared_ptr, I don’t see another Boost library consistently featured in job interviews. In the core language, until c++11 no new syntax was added. Contrast c#.

I give java a score of 70%. I still rely on my old core java skills for job interviews — OO design (+patterns), threading, collections, generics, JDBC. There’s a lot of new development beyond the core language layer, but so far I didn’t have to learn a lot of spring/hibernate/testing tools to get decent java jobs. There is a lot of new stuff in the web-app space. As a web app language, Java competes with fast-moving (churning) technologies like ASP.net, ROR, PHP …, all of which churn out new stuff every year, replacing the old.

For me, one recent focus (among many) is C#. Most interesting jobs I see demand WPF. This is high-churn — WPF replaced winforms which replaced COM/ActiveX (which replaced MFC?)… I hope to focus on the core subset of WPF technologies, hopefully low-churn. Now what is the core subset? In a typical GUI tool kit, a lot of look-and-feel and usability “features” are superstructures while a small subset of the toolkit forms the core infrastructure. I feel the items below are in the core subset. This list sounds like a lot, but is actually a tiny subset of the WPF technology stack.
– MVVM (separation of concern),
– data binding,
– threading
– asynchronous event handling,
– dependency property
– property change notification,
– routed events, command infrastructure
– code-behind, xaml compilation,
– runtime data flow – analysis, debugging etc

An illuminating comparison to WPF is java swing. Low-churn, very stable. It looks dated, but it gets the job done. Most usability features are supported (though WPF offers an undoubtedly nicer look and feel), making swing a capable contender for virtually all GUI projects I have seen. When I go for swing interviews, I feel the core skills in demand remain unchanged. Due to the low churn, your swing skills don’t lose value, but swing does lose demand. Nevertheless I see a healthy, sustained level of demand for swing developers, perhaps accounting for 15% to 30% of the GUI jobs in finance. Outside finance, wpf or swing is seldom used IMO.

## low-hang`@@ perishable@@ niche@@ #2nd slow job search

(See also blog post http://bigblog.tanbin.com/2013/12/strategic5y-investment-in-tech-xx-again.html, )

Evaluation in terms of long term job security, demand, job search, job interview, salary level

^^ core c#
^ python? growing demand; low-hanging fruit; might open many golden gates at baml and jpm
^ FIX? low churn. Hard to self-learn
– socket? frequently quizzed, low churn
– wcf
v wpf? not my chosen direction at the moment. Too big a domain.
– linux/c++ optimization? too niche
^ option math, stoch? often asked in-depth, but few roles in SG
– fixed income math? not used on the buy-side
– risk mgmt math? stable demand
v quant strategy? vague, dubious domain, apparently zero role in SG

## Some Categories@IV questions are evergreen

Background – older developers often lament that generations of tech churn threaten our marketability/survival and devalue our expertise in an old generation of technology.

Key is the job interview. Think about the categories of tech questions. Some categories are immune to the churn… If you invest months (years? impractical) preparing for these questions, your investments may, counter-intuitively, endure. Here are some categories —

#1a) Algo + data structure? Yes. The primary focus of top tech firms like FB, google, MS, Amazon. However, in coding tests they also need working code, therefore you need substantial coding experience with
** STL (or collections) +
** strings — non-trivial
** pointers

#1b) algo quizzes — dense recursion/loops (eg quicksort). Think fast and make it work. Hardcore coding ability. I think you don’t even need STL for this.
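The “dense recursion” in #1b can be as compact as this classic quicksort sketch, the sort of code candidates are expected to produce from memory:

```cpp
#include <utility>
#include <vector>

// Classic in-place quicksort with a Hoare-style partition --
// dense recursion plus tight loops, no STL algorithms needed.
void quicksort(std::vector<int>& a, int lo, int hi) {
    if (lo >= hi) return;
    int pivot = a[lo + (hi - lo) / 2];
    int i = lo, j = hi;
    while (i <= j) {
        while (a[i] < pivot) ++i;              // scan from the left
        while (a[j] > pivot) --j;              // scan from the right
        if (i <= j) std::swap(a[i++], a[j--]);
    }
    quicksort(a, lo, j);                       // recurse into both halves
    quicksort(a, i, hi);
}
```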

#2) Threading (only c/c#/java)? Not really unchanging. They do ask about new features such as java thread pools.
** threading algorithms using standard constructs? Yes. often among the toughest IV questions.

SQL? Yes

OO principles as implemented in a specific language? Yes

Socket? not sure

Design patterns? No, a fad.

## ADO.net ^ linq ^ EF #random notes

churn alert …

Inside Microsoft camp, I feel the 3 generations of dominant[1] DB access tools are ado, linq2sql and EntityFramework. I feel this is a churning field, so a lot of popular technologies will come and go. ADO/ODBC tends to survive longer. (My experience and focus is WPF/WCF…)

[1] At any time in dotnet history, there would be “chimps” challenging the gorilla. I don’t even know their names, though some may grow quite popular.

A) Ado.net is like the basic jdbc. ADO works with DataSet and DataTable. Better performance than any ORM (Linq/EF), because it doesn’t do mapping to objects.
http://msdn.microsoft.com/en-us/library/bb399365.aspx says there are three separate ADO.NET-based LINQ technologies: LINQ to DataSet, LINQ to SQL, and LINQ to Entities. 1) LINQ to DataSet provides richer querying over the DataSet; 2) LINQ to SQL enables you to query SQL Server directly; 3) LINQ to Entities is less popular and probably more ambitious and complicated.

B) L2sql and EF both generate SQL queries, often unoptimized.

B1) The DataSet is a disconnected programming model in ADO.NET. L2DataSet is an upgrade/enhancement/expansion of it.

B2) LINQ to SQL is for developers who do not require mapping to a customized, conceptual model. LINQ to SQL enables developers to generate entity classes from DB tables. Rather than mapping to a conceptual data model, these generated classes map directly to database tables. You can easily mix LINQ to SQL code with existing ADO.NET applications

C) the competition between the simpler L2SQL vs the bigger EntityFramework.
LINQ to SQL is a simple, easy-to-use ORM system. It allows for only simple inheritance or association features. There was talk a while back about Microsoft retiring it, but it is likely to be around for a while.

EF is more suitable for complex object graphs. It allows for more complex inheritance or association features. EF shares many features with NHibernate.

##classical engineering fields #swing@@

At 60, a mechanical engineer can still be productive. The finance IT fields below are similar: high-value, tech-heavy and finance-light ==> portable(!); optimization/engineering/precision ==> detail-oriented.

Socket – C, Non-blocking IO
Unix, syscalls, esp. kernel hacking – C
low latency – C
SQL tuning
DBA? new tools emerge
memory management – C/C++ only. Other languages are “well-insulated”
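The first item above — non-blocking socket IO — is the same readiness-multiplexing pattern whether you write it against C’s select()/epoll() or a wrapper. A minimal sketch using Python’s stdlib selectors module and a socketpair as a stand-in for the C API:

```python
import selectors
import socket

# A selector multiplexes readiness events over non-blocking sockets --
# the core pattern behind C's select()/poll()/epoll().
sel = selectors.DefaultSelector()
a, b = socket.socketpair()
a.setblocking(False)
b.setblocking(False)
sel.register(a, selectors.EVENT_READ)

b.send(b"hello")                 # make `a` readable
for key, mask in sel.select(timeout=1):
    # recv() won't block here: the selector reported the fd as ready
    data = key.fileobj.recv(64)
    print(data)                  # b'hello'

sel.unregister(a)
a.close(); b.close()
```

The skill that stays stable is the event-loop structure itself; only the wrapper API churns.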

— these aren’t
windowing toolkits like Motif, Swing, WPF? High churn rate!
threading — new tools make old techniques obsolete

##[11]technologies relevant]2050 #inspired by architect story

Watching an in-depth documentary about an architect (I.M.Pei) in his 80’s, I started thinking (again) about which app-dev technologies/experiences would still be relevant when I turn 80. I think more “winning” software tools (software products are tools) will be adopted, each dominating a specific domain and displacing the old guard. But what about mainstream technologies? Which are truly resilient in the face of destructive sea changes?

Note many of these technologies could be sidelined and dethroned but still relevant!

  • #1) C
  • #2) unix/linux
  • * [L] multi-threading basic constructs? All the basic low-level constructs are decades old, but not the high-level constructs
  • * [L] socket, tcp / ip
  • * unix, sql, network tuning
  • * [L] classic data structures and (only) the algorithms on them. STL was the pioneer.
  • * c++? Less resilient than C, but since a C++ compiler is usable by C developers, C++ features would remain usable even if not always relevant.
  • SQL and stored procedure coding?
  • [L] memory management — pointers, allocation/deallocations, definitely relevant to ultra high volume, low latency apps(?) More generally, In “demanding contexts” (Scott Meyers) I feel mem mgmt will remain extremely relevant, perhaps beneath the surface
  • * GUI? High churn. Requirement will stay but the constructs and the programming language/technique may change completely. GUI threading design seems to be consistent throughout.
  • * MOM architecture? probably yes but implementation may change so completely that your knowledge is utterly irrelevant. IBM MQ and RV are long-standing, largely due to the relevance of C.

# eor
* RPC, web service, corba, RMI… Resilient model, but not implementations
* [L] system calls? Actively used by few coders but relevant underneath the surface
* batch jobs? Requirement yes; implementation no.

[L=Low Level, closer to the metal, rather than application level]

##which (vendor-)implementations to invest in

I may have talent in comp science theories. But I place my highest emphasis on familiarity with implementations — tools, utilities, products, packages, jars, building blocks — download-able stuff.

With theories, a talented student can progressively go deeper, higher, sharper, and in less time as her foundation strengthens and broadens. I feel less so with tools. Familiarity with implementations takes a hell of time.

For example, when something doesn’t work with something else, diagnosing it has often taken me anywhere from a few minutes to a few days. Therefore I have a strong aversion to new technologies and a strong affinity for mature, time-honored tools.

For example, countless designs work great on paper or during interviews. The #1 thing about any tool is the limitations you are likely to hit. Even a minor drawback can derail your entire project.

Some tools I’m investing into (either my old turf or new targets, the +++) and some (–) I’m divesting
++ c/java debuggers
++ sybase, oracle, mssql
+ rv, MQ (JMS is a spec, not an implementation)
+ unix sockets
+ eclipse
+ pthreads implemented in linux and solaris
+ python + solaris/linux latency tuning
+ pl/sql?
– spring / hibernate
– bash customization
– http
– mysql, php
– vi — needs 2 years. See [[Productive Programmer]]
– any local system knowledge

##some java skills no longer]demand

(another blog post)

Any powerful technology requires learning. Whatever learning effort I put into EJB, Corba, struts, JSP, JNI, J2ME, Swing, AWT, rule engines .. is becoming less and less valuable to employers. The march of java is merciless. How merciless? Even the creator of java – Sun Microsystems – had to go open source on both java and Solaris just to survive, and then sell itself to Oracle.

I am now very careful not to invest myself too heavily into anything including popular must-know stuff like Spring, hibernate, distributed cache, EJB3, AOP, CEP, JSF, GWT, protobuf, web services, design patterns, RMI …

I think most of the above are jxee, not coreJava.

Instead, I seek stability in older core java technologies like threading, collections, serialization, messaging, reflection, …


(C++ is even older and more stable, but this post is all about java.)


  • Skillist — the leaf-level tech skills. This label is most relevant on old posts without label.
  • 5y_domainBet — helps me choose which industry sector to invest my time and accumulate.
    • Not necessarily a list.
    • 5Y horizon
  • 10y_dir — longer broader view
  • specialize — I’m a specialist type of professional, not a generalist or manager. These posts help me position myself, not necessarily restricting to a particular domain.
  • accu — less specific than “specialize”
  • churn
  • t_skillist and t_feature tags rarely but could be on one post