op-new: allocate^construct #placement #IV

Popular IV topic. P41 [[more effective c++]] has an excellent summary:

  1. to BOTH allocate (on heap) and call the constructor, use regular q(new)
  2. to allocate WITHOUT construction, use q(operator new)
    1. You can also use malloc. See https://stackoverflow.com/questions/8959635/malloc-placement-new-vs-new
  3. to call the constructor on heap storage already allocated, use placement-new

The book has examples of Case 2 and Case 3.

Note it’s common to construct objects directly on the stack and in the global area, but on the heap, placement-new is the only way to run a constructor on pre-allocated storage.

Placement-new is a popular interview topic (Jump, DRW and more …) but rarely used in common projects.


some TECHNICALITIES for QnA IV

We often call these “obscure details”. They are a small subset of a vast number of similar details, so we can’t possibly remember them all. Surprisingly, interviewers show clear patterns when picking which technicality to ask about. Perhaps these “special” items aren’t at the same level as the thousands of other obscure details after all?

These topics are typical of QQ i.e. tough topics for IV only, not tough topics for GTD.

  • archetypical: which socket syscalls are blocking and when
  • some of the hundreds of g++ options?
  • $LD_LIBRARY_PATH
  • hash table theoretical details? too theoretical to be part of this discussion
  • select() syscall
  • vptr

t_ivTechnicality is the tag

why MultiCast is favored over TCP

Reason: data rate constraints inherent in the TCP protocol. Congestion control?
Reason: TCP to a large group would be one-by-one unicast, highly inefficient and too much load on the sender.
Reason: TCP has more data-overhead in the form of non-payload data:
* TCP header is typically 20 bytes vs 8 bytes for UDP
* receivers need to acknowledge

q[g++ -g -O] together

https://linux.die.net/man/1/g++ has a section specifically on debugging. It says

GCC allows you to use -g with -O

I think -g adds additional debug info into the binary to help debuggers; -O turns on compiler optimization.

By default, our binaries are compiled with “-g3 -O2”. When I debug these binaries, I can inspect variables, but the optimizer rearranges code so stepping jumps around the source lines, causing minor problems. See my blog posts on gdb.

veteran (前辈) civil engineer ^ old C programmers

Opening example, which I shared with Wael… If you meet a regular (not a specialist) civil engineer aged 50, you respect and value his skills, but what about a C programmer of the same age? In the US I guess he’d be respected much like the civil engineer, but in SG? Likely to be seen with contempt. The key is the /shelf-life/ of the skill.

Look at civil engineers, chemical engineers, accountants, dentists, carpenters or history researchers (like my dad). A relatively high percentage of those with 20Y of experience are veterans (前辈). These fields let you specialize and accumulate.

In contrast, look at fashion, pop music, digital media… I’m not familiar with these professions, but I feel someone with 20Y of experience may not be a veteran (前辈). Why? Because their earliest experience loses relevance like radioactive decay. The more recent the experience, the more relevant to today’s consumers and markets.


Now let’s /cut to the chase/. For programmers, there are some high-churn and some “accumulative” technical domains. It’s each programmer’s job to monitor, navigate, avoid or seek them out. We need to be selective. If you are in the wrong domain, then after 20Y you are just an old programmer, not a veteran (前辈). I’d love to deepen my understanding of my favorite longevity[1] technologies like

  • data structures, algos
  • threading
  • unix
  • C/C++? at the heart of, or beneath, many of these items
  • RDBMS tuning and design; SQL big queries
  • MOM like tibrv
  • OO design and design patterns
  • socket
  • interactive debuggers like gdb

Unfortunately, unlike civil engineering, even the most long-living members above could fall out of favor, in which case your effort doesn’t accumulate “value”.

– C++ is now behind-the-scenes of java and c#.
– threading shows limited value in small systems.

[1] see the write-up on relevant55

–person-profession matching–
An “accumulative” profession like medical research can 1) be hard to get into, 2) require certain personal attributes like perseverance, attention to detail and years of focus, and 3) be uninspiring to an individual. Only a small percentage of the population gets into such a career. (The drop-out rate could be quite high.)

For many years in my late 20’s I was completely bored with technical work, esp. programming, in favor of pre-sales and start-up. But after my U.S. stay I completely changed my preferences.

mkt-data tech skills: !!portable !!shared

Raw mkt data tech skill is better than soft mkt data skill even though it’s further away from “the money”:

  • standard — Exchange mkt data format won’t change a lot. Feels like an industry standard
  • the future — most OTC products are moving to electronic trading and will have market data to process
  • more necessary than many modules in a trading system. However ….. I guess only a few systems need to deal with raw market data. Most downstream systems deal only with the soft market data.

Q1: If you compare 5 typical market data gateway dev [1] jobs, can you identify a few key tech skills shared by at least half the jobs, but not a widely used “generic” skill like math, hash table, polymorphism etc?

Q2: if there is at least one, how important is it to a given job? One of the important required skills, or a make-or-break survival skill?

My view — I feel there is not a shared core skill set. I venture to say there’s not a single answer to Q1.

In contrast, look at quant developers. They all need skills in c++/excel, Black-Scholes, bond math, swaps, …

In contrast, also look at dedicated database developers. They all need non-trivial SQL and schema design. Many need stored procs. Tuning is needed for large tables.

Now look at market data gateways for OPRA. Two firms’ job requirements will share some common tech skills like throughput (TPS) optimization and fast storage.

If latency and TPS requirements aren’t stringent, then I feel the portable skill set is an empty set.

[1] There are also many positions whose primary duty is market data but not raw market data, not large volume, not latency sensitive. The skill set is even more different. Some don’t need development skill on market data — they only configure some components.