Archive for January, 2010
So I periodically dabble with my hifi setup. I rearrange stuff, I recalibrate my reference levels at the sitting positions using an inexpensive sound pressure level meter, and measure the distances using a cool laser distance meter.
I ignore my acoustics-engineer self (the one left in place by 10 years of SONAR system engineering), who is screaming (Edvard Munch style) at the banality of my exercise.
My engineer self does have a point: my SPL meter, for example, is a cheapo Radio Shack SPL meter. It measures signals in decibels. But a decibel is a ratio between two numbers: a reference figure and a measured value. For example, a proper measurement would be 12 dB re 1 µPa @ 1 m, which would mean that my signal was 12 decibels relative to a pressure wave of 1 micropascal, as measured 1 meter from the source. My practical self dismisses my engineer self by saying “it is all relative anyway, so the exact parameters of the measurement are not important”; to which my engineer self scoffs with a resounding “Idiot! If you don’t understand what you are measuring, then anything that you measure is suspect. For example, your rear speakers naturally have a different frequency response than your front speakers. Hence, if you try to balance them using the SPL meter, and you don’t *really* understand how it sums the SPL across the frequency range, you might get inconsistent results. This will also be true due to differences in the vertical response of the speakers vis-à-vis your sitting position.” Now once in a while my engineer self nearly gets a sure footing and I trend precipitously close to acquiring a Brüel & Kjær measurement system so I can start measuring with aplomb. I usually luck out by ending up reading some article I find somewhere instead of paying the requisite megabucks for the B&K Über Geräte.
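For the curious, the decibel arithmetic my engineer self is fussing about is easy to sketch. A minimal example (the 20 µPa reference is the standard one for airborne sound; sonar work uses 1 µPa, which is why the reference matters):

```python
import math

def spl_db(pressure_pa: float, reference_pa: float = 20e-6) -> float:
    """Sound pressure level in dB relative to a reference pressure.

    20e-6 Pa (20 µPa) is the conventional reference for sound in air;
    underwater acoustics (sonar) conventionally uses 1 µPa instead.
    """
    return 20 * math.log10(pressure_pa / reference_pa)

# The same 1 Pa pressure wave, two different references:
print(round(spl_db(1.0), 1))        # 94.0 dB re 20 µPa (air convention)
print(round(spl_db(1.0, 1e-6), 1))  # 120.0 dB re 1 µPa (sonar convention)
```

Same physical signal, a 26 dB difference on paper — which is exactly why “it is all relative anyway” is a dangerous shrug.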
So here is a question: most of us know that there is a hifi market denoted as “Audiophile”. There is also a market called “professional audio”. There are very few brands that cater to both (I can only think of Dynaudio, Bryston, JBL, ADAM Audio, PMC, JM labs and a few others), and many of the products are labelled accordingly (pro audio vs. home audio). Now audio is audio – why is there such a distinct separation between the two markets?
Here are my opinions:
1. It isn’t looks (so called Wife Acceptance Factor – WAF) – Some home audio stuff is as horrid looking as the most functional of pro audio devices. And some pro audio stuff is drop dead gorgeous.
2. It isn’t pricing – Some pro audio stuff is as expensive as audiophile stuff. Even though it is easier to justify the really upper end stuff for home use (the justification is based on expendable income, just like an ultra high-end stovetop for people who only cook steaks, more than any value statement) – it isn’t really necessary for a recording studio.
3. Objective vs. subjective sensibilities – By far the biggest differentiator. Audiophile makers differentiate themselves by ratings, by reviews, and mostly by subjective assessments. Audio professionals look for objective assessments (impossible to do perfectly, but possible to strive for). In fact many Audiophiles disregard objective assessments (like measurements) as secondary to subjective assessments (like listening to their favorite CDs). Meanwhile pros (like audio designers) measure first, and then validate the measurement with listening tests (to ensure they haven’t built a “lemon”).
Note: The audio engineering market is exceptionally mature, so I have to accept the fact that both approaches have their merit. Audiophiles indeed have to rationalize their choices – and subjective assessments are the best way to rationalize a choice, especially when there is no consensus on “state-of-the-art”. Meanwhile, audio pros have to make rational choices – for example, unlike an Audiophile, they must have a perfectly flat frequency response, otherwise their recordings will be equalized to compensate and will tend to sound “off” on other equipment. This might be interpreted by their customers as a quality deficiency, resulting in fewer, lower-paying projects. So both approaches are correct for their respective market segments.
What has this got to do with security? Well, security is just like any other market. It has the customers that rationalize their decision, and it has customers that make rational decisions. Now here is the funny fact-of-life: customers in the latter group tend to be assured with their decision and can defend it reasonably well, while customers in the former group tend to hem-and-haw and sort themselves into religious-like user camps. Just like the Audiophiles who flock to like-minded rationalization groups (like the sound-of-wire vs. all wires are identical camp, the Single Ended Triode vs. push-pull camp, the record player vs. CD or BD camps, the solid state vs. vacuum tube camps etc.).
The rational-thinking “objective” group (typically early adopters) works like entrepreneurs: they identify a problem, create a list of parameters for their problem, and search for solutions. The decision-rationalizing “subjective” group works in other ways, for example by stating top-level decision criteria inconsistent with the problem scenarios.
As an example, to compensate for their inability to reach a sensible technical decision – or even a sensible description of the problem they are trying to solve – they will choose based on other parameters, like integration with other products (whether those integrations make sense or not), or based on analyst opinion, past relationships, a reference list, or even past lust (or a current bedroom relationship).
This is the “high end” model, based on perceived applicability rather than measured applicability to the problem. Security folk are especially prone to this style of analysis since their role is multi-disciplinary. The DLP market is becoming the best example of how this multi-disciplinary responsibility serves to undermine the decision process, eventually resulting in an alarming number of failed projects. For example, take a security person who came from networking. Their background is reviewing logs and identifying the patterns of malware, and they have a keen understanding of exploits. Being the best on their team, they are invited to participate in a DLP product selection committee. What, within their experience, allows them to understand the nature of risk due to information exposure? Not much… For the majority of technical security experts, the meaning of risk (and the methodologies to assess and minimize it) is opaque. What is worse, risk is the sort of variable that everyone thinks they understand and very few actually do. Even banking risk departments, who are supposed to be the leaders in risk-class assessment, proved that they had no clue a year or so ago when they piled high-risk products into lower-risk bundles – just ask any jet airliner designer how wrong that assumption is.
Similarly, consider a CISO. Predominantly a business title, how is a CISO to assess the technical capabilities and applicability to the network of a DLP solution? A good CISO is ill equipped to provide a concrete technical answer to the question of technical suitability.
Add to this equation the fact that business folk and technical folk might as well speak different languages altogether, and you are left with dire prospects for your selection committee.
This is where the analogy between the audio market and the security market ends. An amplifier is an amplifier. It might amplify differently, but all amplifiers, especially at the high end of the market, do a reasonable job of amplification. Almost all pro models are identical. That is the safety of a mature market. But the security market, by its nature, will never mature. Hackers and thieves will ensure that whatever we purchase today will be outdated quickly (as quickly as they can write the scripts to make it outdated). The result of emotional decisions in an immature market can be disaster. Remember the sods who bought the original early-day $15–25k hi-def plasma displays, only to have them become obsolete within 2 years due to the emergence of copy protection (HDCP)?
So two years later the committee finally realizes that while they really needed the equivalent of a pickup truck, they had mistakenly acquired a dragster. It couldn’t pull the weight of the problem, it was hard to control, and it tended to periodically veer off into the ditch. They hired a team of 100 to rebuild the engine every Tuesday and Thursday. And you needed a semi-trailer to haul the damn thing around.
But at least they purchased “high end”. Colorful, shiny, heavy, and what a guilt trip (as well as sometimes career limiting). As one CISO put it to me, it is “the cost of a maturing security organization”.
Back to my speakers. The Radio Shack SPL meter is useless for calibrating the sub level (due to inconsistencies in its frequency response). Damn it. Perhaps it is time for a B&K measurement station? Gotta love those Danes for their perfect measurement stuff. $5k – eh? Nah. ETF 5 and a somewhat calibrated Behringer mic ($50) is all I really need.
While Sharon is busy waddling knee-deep through PHY (layer 1) terminology, another hardware/lifestyle company has released the gigantic equivalent of its iPod product, named the iPad.
Somehow the glitter of a lower-tech color LCD screen has been noisier than Sharon’s über technology switches. Go figure.
While I’m waiting for Assaf to create the next-gen soup, here is a partial list of abbreviations, and their meanings, that I have to learn. In some cases it has taken me 20+ years to get to them…
- AUI : Attachment Unit Interface, originally connected to a MAU
- MAU : Medium Attachment Unit, like a 10Base-2 transceiver.
- XAUI : A 10G AUI, the X is the Roman numeral for 10; Data path is 4×3.125Gbps Lanes
- XLAUI : A 40G AUI, XL being the Roman numeral for 40; Data path is 4×10.3125Gbps Lanes
- CAUI: A 100G AUI, C (you guessed it) being the Roman numeral for 100; Data path is 10×10.3125Gbps Lanes
- MII : Medium Independent Interface, 4bit wide data path.
- RMII : Reduced MII, the MII but with fewer signals!
- SMII : Serial MII, the data path is reduced to one bit.
- GMII : Gigabit MII, 8bit wide data path.
- RGMII : Reduced Gigabit MII.
- SGMII : Serial Gigabit MII.
- XGMII : 10G MII (this time the G made it in).
- XGXS : XGMII eXtender Sublayer.
- XLGMII : 40G MII.
- CGMII : 100G MII.
- MAC : Media Access Controller.
- PLS : Physical Layer Signaling; for 10Mbps only, implemented the Manchester encoding.
- RS : Reconciliation Sublayer.
- PCS : Physical Coding Sublayer; e.g. 8B/10B.
- MLD : Multi Lane Distribution
- PMA : Physical Medium Attachment.
- PMD : Physical Medium Dependent.
- IPG : Inter Packet Gap; Code words sent between valid Ethernet Frames.
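The lane arithmetic behind the xAUI names above is worth a quick sketch. A toy calculation (assuming the standard line codings: 8b/10b for XAUI, 64b/66b for the 40G/100G interfaces):

```python
def payload_gbps(lanes: int, lane_rate_gbps: float, coding: str) -> float:
    """Usable data rate of a multi-lane interface after line-coding overhead."""
    efficiency = {"8b/10b": 8 / 10, "64b/66b": 64 / 66}[coding]
    return lanes * lane_rate_gbps * efficiency

# XAUI: 4 lanes at 3.125 Gbps with 8b/10b coding -> 10 Gbps of payload
print(round(payload_gbps(4, 3.125, "8b/10b"), 3))      # 10.0
# CAUI: 10 lanes at 10.3125 Gbps with 64b/66b coding -> 100 Gbps of payload
print(round(payload_gbps(10, 10.3125, "64b/66b"), 3))  # 100.0
```

The odd-looking lane rates (3.125, 10.3125) stop looking odd once you see they are just the payload rate divided by the coding efficiency.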
First, let me start by stating that this is NOT a security issue with Google, even though it might be presented this way.
Unless you have been hiding in a cave for the past few hours, you know that Google is taking some serious steps to protect its customers (you, me, all of us) after it was attacked one more time (see “Google on the defensive, vulnerable; China risks international and U.S. response”). Among other things, “Google Finally Improves Security of Gmail Connections as Consumer Watchdog Urged”, which is great:
Consumer Watchdog said Google should use encryption for connections to all its Internet-based services, not just Gmail. The new security measures would not have prevented the sort of cyber attack that targeted Google from China. It does increase security to prevent third parties from snooping as information moves from a computer over a network to Google’s servers. Google has offered SSL encryption using the https protocol as an option since 2008.
But if you look at the screenshot you can see that NOT all the traffic is encrypted… While this might be OK for static pages, who knows what other pages are not protected with SSL? Why can’t you turn it on for the entire site? It would add more credibility and assurance…
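Spotting the unencrypted traffic doesn’t require a screenshot, by the way. A minimal sketch, using Python’s standard library and a hypothetical list of captured request URLs (the URLs are illustrative, not taken from the actual page):

```python
from urllib.parse import urlparse

def insecure_requests(urls):
    """Return the subset of URLs that would be fetched without TLS."""
    return [u for u in urls if urlparse(u).scheme != "https"]

# Hypothetical requests captured during a page load:
page_requests = [
    "https://mail.google.com/mail/",
    "http://www.google.com/images/logo.gif",  # static asset over plain HTTP
    "https://www.google.com/accounts/",
]
print(insecure_requests(page_requests))
# -> ['http://www.google.com/images/logo.gif']
```

Any non-empty result means a third party on the network path can see (and tamper with) at least part of the page.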
Victor Stampfer knows something about cooking. At the end of his long book he lists this knowledge clearly and articulately:
- Cellulose hydrolysis: above 120°C
- Cooking vegetables to hydrolyze pectin and starch: > 85°C
- Pectin hydrolysis: 85°C
- Cooking vegetables to hydrolyze starch only: 80–85°C
- Starch solubilization and hydrolysis: 80°C
- Myofibrillar proteins and myoglobin alteration (loss of water-holding capacity; color is definitely altered): 68°C
Cooking of –
– braised, sautéed, steamed or boiled meats
– roasted white meat
- Sarcoplasmic protein alteration (modification of the perception of color): 62°C
Cooking of –
– just-cooked fish: < 62°C
– red, roasted or grilled meats: < 62°C
– rare: 56–58°C
– medium rare: 58–60°C
– medium well: 60–62°C
– well done: > 62°C
- Beginning of the destruction of vegetative forms of bacteria: 52°C (careful of spores!)
- Bacterial growth, spore germination: < 52°C
It is really all you need to know to cook some of the best meals of your life.
Since the Fluffinator is a very geeky coffee project of mine, I have posted it on home-barista. Those of you who happen to own a Mazzer Mini E grinder and are somewhat disappointed with the quality of its grind will find it very useful.
All the rest of you won’t, but may appreciate the level of myopic focus invested in diminishing returns exhibited by the people involved. And then some of you will disapprove and comment on the injustice of it all, and that half the world is hungry. Well, that is the half that picks the coffee that both you and I drink. So there – life is not fair.