Test Driven Development (TDD)
Test, Code, Refactor https://en.wikipedia.org/wiki/Test-driven_development
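A minimal sketch of the test-first loop using Python's built-in unittest (the `slugify` function is a made-up example, not from any library): write a failing test first (red), write just enough code to pass (green), then refactor while the tests stay green.

```python
import unittest

# Step 1 (red): the tests below are written first and fail because
# slugify does not exist yet.
# Step 2 (green): write the minimal implementation that passes.
def slugify(title):
    """Lowercase a title and join its words with hyphens."""
    return "-".join(title.lower().split())

# Step 3 (refactor): clean up the implementation while these stay green.
class TestSlugify(unittest.TestCase):
    def test_lowercases_and_hyphenates(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_collapses_extra_whitespace(self):
        self.assertEqual(slugify("  A   B "), "a-b")
```

Run with `python -m unittest` to watch the cycle go from red to green.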
- Lean: Identify Value -> Break Down Steps -> Continuous Flow -> Reduce Waste
- Lean Startup: Build, Measure, Learn http://theleanstartup.com/principles
Better than Free
- social proof
Competition and Profit (Porter's Five Forces)
- Threat of new competition (barriers to entry, customer loyalty, desirability of that industry/biz model)
- Threat of substitute products or services (switching costs, quality, compatibility)
- Bargaining power of customers (switching costs, market options, dependency for other services)
- Bargaining power of suppliers (switching costs, supplier choice, supplier becoming competitor)
- Intensity of competitive rivalry (innovation, branding, economies of scale)
Schelling's segregation model
Micromotives and Macrobehavior, https://en.wikipedia.org/wiki/Thomas_Schelling#Models_of_segregation
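A hedged sketch of the segregation model (assumptions are mine: a one-dimensional ring, two agent types, and agents who relocate to a random empty cell whenever fewer than a threshold fraction of their occupied neighbors match their type):

```python
import random

def schelling_1d(n=60, empty_frac=0.2, threshold=0.34, steps=2000, seed=1):
    """Toy 1-D Schelling model: agents of type 'A'/'B' on a ring with
    empty cells (None). An agent is unhappy when fewer than `threshold`
    of its occupied neighbors share its type, and then moves to a
    random empty cell. Returns the final grid."""
    rng = random.Random(seed)
    cells = ["A", "B"] * int(n * (1 - empty_frac) / 2)
    cells += [None] * (n - len(cells))
    rng.shuffle(cells)

    def unhappy(i):
        me = cells[i]
        if me is None:
            return False
        neighbors = [cells[(i - 1) % n], cells[(i + 1) % n]]
        occupied = [x for x in neighbors if x is not None]
        if not occupied:
            return False
        same = sum(1 for x in occupied if x == me)
        return same / len(occupied) < threshold

    for _ in range(steps):
        movers = [i for i in range(n) if unhappy(i)]
        empties = [i for i in range(n) if cells[i] is None]
        if not movers or not empties:
            break  # everyone is content: a stable (clustered) state
        i, j = rng.choice(movers), rng.choice(empties)
        cells[j], cells[i] = cells[i], None
    return cells
```

The striking result: even a mild in-group preference (threshold around 1/3) tends to produce clustered neighborhoods, micromotives adding up to macrobehavior.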
Granovetter threshold model for peer effect on collective behavior and Strength of Weak Ties (aka how LinkedIn gets you a new job), https://en.wikipedia.org/wiki/Mark_Granovetter
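The threshold model fits in a few lines (my own toy version: person i joins once at least `thresholds[i]` people are already participating, iterated to a fixed point). Granovetter's famous example: a crowd with thresholds 0, 1, 2, ..., 99 riots completely, but change the threshold-1 person to threshold 2 and the cascade dies after one person.

```python
def granovetter_cascade(thresholds):
    """Granovetter threshold model: person i joins once at least
    thresholds[i] people are already participating. Iterate until
    no one new joins; return the equilibrium participant count."""
    participating = 0
    while True:
        joined = sum(1 for t in thresholds if t <= participating)
        if joined == participating:
            return participating
        participating = joined
```

Tiny differences in the threshold distribution can flip the collective outcome from total participation to almost none.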
- Bandwagon Effect (Groupthink)
- Confirmation Bias: search for and interpret information and memories that support preconceptions
- Gambler's Fallacy: the mistaken belief that future probabilities are affected by previous independent outcomes
- Negativity Bias: paying more attention to bad news
- Neglect of Probability: disregarding probabilities when making a decision (risk of flying versus driving)
- Observational Selection Bias (Frequency Illusion): noting something previously ignored results in a misconception that it has increased in frequency
- Projection Bias: wrongly presuming others think like us
- Status Quo Bias: preferring that things stay the same
Logical Fallacies and Disinformation
- Appeal to probability: assuming that because something could happen, it inevitably will happen (see Gambler's Fallacy and Neglect of Probability)
- Silence, Indignant, Rumors, Straw Man, Ad Hominem, Hit and Run, Question Motives, Invoke Authority, Play Dumb, "That's old news", Confess to a lesser item and "come clean", Enigma, Rube Goldberg Logic, Demand a complete solution, Fit the facts to alternate conclusions, Remove witnesses/evidence, Change the subject, Antagonize, Ignore proof and demand impossible proof, False evidence/facts, Loudly call for a separate investigation (ideally either biased or with confidential findings), Manufacture a new truth, Larger distractions, Silence critics, Lie low
Keys to being successful
- organize (categorize)
- brainstorm goals and feelings: "say what you're going to do and then do what you say"
Concentrate, Iterate, Automate, Validate, Appreciate
- Software version: core tech strengths & problem, quick releases, automate, test!, style? + recognize the contributions
- Military version: core strength and enemy weaknesses, rapid short executions, make excellence a reflex, check for brittleness, engender loyalty
7 habits of highly effective people
- be proactive
- "begin with the end in mind" (envision the goal)
- "put first things first" (order and prioritize)
- "think win-win" (good outcomes for everyone)
- "Seek First to Understand, Then to be Understood" (listen, then persuade)
- "synergize" (teamwork)
- "sharpen the saw" (sustainable balance)
Maslow's Hierarchy of Needs
The lowest levels of the pyramid must be satisfied before people can focus and succeed at higher levels.
      /-Self-Actualization-\
     /--------Esteem--------\
    /-----Love/Belonging-----\
   /----------Safety----------\
  /-------Physiological--------\
Important Software Concepts
- Don't Repeat Yourself (DRY)
- Model View Controller (MVC)
- Atomicity, Consistency, Isolation, Durability (ACID)
- Abstraction, Polymorphism (overloading, overriding), Inheritance, Encapsulation (the four pillars of object-oriented programming)
- Consistency, Availability, Partition tolerance (CAP) vs Basically Available, Soft state, Eventual consistency (BASE)
- Nondeterministic Polynomial ... NP-hard http://en.wikipedia.org/wiki/NP-hard
- NP-complete (subset sum problem can be verified) http://en.wikipedia.org/wiki/NP-complete
- co-NP (a "no" answer can be verified)
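To make the verification point concrete, a small sketch (the example numbers are mine): finding a subset that sums to a target naively takes exponential time over all 2^n subsets, but checking a proposed certificate takes linear time.

```python
from itertools import combinations

def verify_subset_sum(numbers, target, certificate):
    """Polynomial-time verifier: check that the certificate (a list of
    distinct indices into `numbers`) picks a subset summing to target."""
    if len(set(certificate)) != len(certificate):
        return False  # indices must be distinct
    try:
        return sum(numbers[i] for i in certificate) == target
    except IndexError:
        return False

def find_subset_sum(numbers, target):
    """Exponential-time search over all 2^n subsets (the 'hard' part).
    Returns a certificate (index list) or None."""
    for r in range(len(numbers) + 1):
        for combo in combinations(range(len(numbers)), r):
            if sum(numbers[i] for i in combo) == target:
                return list(combo)
    return None
```

The asymmetry between the searcher and the verifier is exactly what the NP-complete class captures.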
- A system's speedup from parallelization is limited by its inherently serial steps http://en.wikipedia.org/wiki/Amdahl%27s_law
- So benchmark your system, then determine which parts can be parallelized, how much that will improve the result, and how much it will cost to do so
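Amdahl's law itself is a one-liner worth internalizing: if a fraction p of the work parallelizes perfectly across N workers, speedup is 1 / ((1 - p) + p/N). A sketch:

```python
def amdahl_speedup(parallel_fraction, workers):
    """Amdahl's law: overall speedup when a fraction p of the work is
    spread across `workers` and the remaining (1 - p) stays serial:
        S = 1 / ((1 - p) + p / workers)"""
    p = parallel_fraction
    return 1.0 / ((1.0 - p) + p / workers)
```

Even with unlimited workers, a 5% serial portion caps the speedup at 20x, which is why the serial part dominates the benchmarking conversation.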
- The system design produced by an organization will reflect the organization's communication structure. http://www.melconway.com/Home/Conways_Law.html
- Possibly disastrous results when combined with Groupthink http://en.wikipedia.org/wiki/Groupthink
- Commonly referred to when considering how adding a new person or new team to an organization will affect productivity
- Adding manpower (people) to a late software project makes it later http://en.wikipedia.org/wiki/Brooks%27s_law
- A decent observation given the above "laws": if a task has serial parts, adding people (parallelization) will not speed it up, AND every added person must interface with everyone else, increasing communication overhead
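The interface cost grows quadratically: n people have n(n-1)/2 pairwise communication channels, the arithmetic behind Brooks's observation.

```python
def communication_channels(people):
    """Pairwise communication channels among `people` team members:
    n * (n - 1) / 2, which grows quadratically with team size."""
    return people * (people - 1) // 2
```

Growing a team from 5 to 10 people more than quadruples the channels (10 to 45), so coordination overhead can swamp the added capacity.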
- The number of transistors on a chip (roughly, computing power) doubles, or becomes cheaper by half, about every two years http://en.wikipedia.org/wiki/Moore%27s_law
- Sustained in part by improvements in complementary technologies like memory, storage, cooling, etc.
- At some point in the future, further gains may only be possible through parallel computing, which carries an increased coordination cost (including software that can leverage parallelization)
Be conservative in what you do, be liberal in what you accept from others (Postel's Law, aka the Robustness Principle)
Laws of Unix
- Modularity: Write simple parts connected by clean interfaces.
- Clarity: Clarity is better than cleverness.
- Composition: Design programs to be connected with other programs.
- Separation: Separate policy from mechanism; separate interfaces from engines.
- Simplicity: Design for simplicity; add complexity only where you must.
- Parsimony: Write a big program only when it is clear by demonstration that nothing else will do.
- Transparency: Design for visibility to make inspection and debugging easier.
- Robustness: Robustness is the child of transparency and simplicity.
- Representation: Fold knowledge into data, so program logic can be stupid and robust.
- Least Surprise: In interface design, always do the least surprising thing.
- Silence: When a program has nothing surprising to say, it should say nothing.
- Repair: Repair what you can, but when you must fail, fail noisily and as soon as possible.
- Economy: Programmer time is expensive; conserve it in preference to machine time.
- Generation: Avoid hand-hacking; write programs to write programs when you can.
- Optimization: Prototype before polishing. Get it working before you optimize it.
- Diversity: Distrust all claims for one true way.
- Extensibility: Design for the future, because it will be here sooner than you think. (Or, to put it another way, your creations will last longer than you think!)
- Source Code is for humans, make it easy to read and understand
- The code is the authoritative source (comments add context)
- Leave the campground cleaner than you found it
- Tests reveal what the code does; clean code passes all of its tests
- Meaningful Names
- Functions: as few parameters as possible, and the smaller the function the better
- Open-Closed Principle (open for extension, closed for modification)
- Single Responsibility (do one thing, and do it well)
- No Duplication (DRY)
- Objects allow modularity, Boundaries keep you sane
- Separate Constructing a System from Using it (and Initialization from Runtime)
Fallacies of Distributed Computing
- The network is reliable.
- Latency is zero.
- Bandwidth is infinite.
- The network is secure.
- Topology doesn't change.
- There is one administrator.
- Transport cost is zero.
- The network is homogeneous.
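A small defensive sketch against the first two fallacies (assumptions are mine: the remote call raises ConnectionError on failure, and the backoff parameters are illustrative): since the network is not reliable and latency is not zero, wrap remote calls in bounded, jittered retries instead of assuming instant success.

```python
import random
import time

def call_with_retries(op, attempts=4, base_delay=0.1,
                      rng=random.random, sleep=time.sleep):
    """Call `op` with bounded retries and jittered exponential backoff.
    Raises the last ConnectionError once the retry budget is spent."""
    for attempt in range(attempts):
        try:
            return op()
        except ConnectionError:
            if attempt == attempts - 1:
                raise  # budget exhausted: surface the failure
            # exponential backoff with jitter to avoid retry storms
            sleep(base_delay * (2 ** attempt) * (0.5 + rng()))
```

Bounding the attempts matters as much as retrying: unbounded retries against a struggling service just add to the load.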
How Complex Systems Fail
- Complex systems are intrinsically hazardous systems
- Complex systems are heavily and successfully defended against failure
- Catastrophe requires multiple failures – single point failures are not enough
- Complex systems contain changing mixtures of failures latent within them
- Complex systems run in degraded mode
- Catastrophe is always just around the corner
- Post-accident attribution to a ‘root cause’ is fundamentally wrong
- Hindsight biases post-accident assessments of human performance
- Human operators have dual roles: as producers & as defenders against failure
- All practitioner actions are gambles
- Actions at the sharp end resolve all ambiguity
- Human practitioners are the adaptable element of complex systems
- Human expertise in complex systems is constantly changing
- Change introduces new forms of failure
- Views of ‘cause’ limit the effectiveness of defenses against future events
- Safety is a characteristic of systems and not of their components
- People continuously create safety
- Failure free operations require experience with failure
Why do computers stop and what can be done about it? (Jim Gray, 1985)
Dickerson's Hierarchy of Reliability
Product                             (top)
Development
Capacity Planning
Testing and Release Procedures
Postmortem and Root Cause Analysis
Incident Response
Monitoring                          (base)
Designers/Creators of Programming Languages
| Language | Designer(s)                              | Year | Reference                                                   |
| Python   | Guido van Rossum                         | 1991 | https://en.wikipedia.org/wiki/Python_(programming_language) |
| Go       | Robert Griesemer, Rob Pike, Ken Thompson | 2009 | https://en.wikipedia.org/wiki/Go_(programming_language)     |
A quick history of software (in ascii)
hardcoded hardware (ENIAC) -> von Neumann architecture (stored programs) -> mainframes with custom punch cards (assembly) -> procedural code (Fortran, C) -> object oriented (Simula, Java) -> parallel programming -> Artificial Intelligence that writes self-adapting Domain Specific Languages for everything?
Start by reading all of the following to nitpick how the above is fast and loose with history and the truth...