
Anticipating problems

Whoa, hello! Good Monday morning. It's really, really bad weather here; I fully expect to fall multiple times during the making of this video. Either way, at least my feet are not wet today. I would like to talk to you a little bit about the concept of accepting risk, in software development and in life in general: the concept of accepting that failures might happen, and allowing a little bit more risk of failure rather than trying to anticipate everything that can go wrong and acting defensively. Being a little bit more aggressive, a little bit more fearless when it comes to development; having more of an experimental mindset rather than a defensive one. So I was having this
discussion with my girlfriend Isa, who is also a software developer, about this situation that you sometimes run into in software development: you're about to build a solution to some kind of problem, you know, implement some library, start doing things in a new way, adopt a new framework, or start adding networking calls to your app, and at that point someone on your team (or you yourself, for that matter) starts going down this path of what-ifs. What if this makes the application heavier? What if we become dependent on this open-source project that is not being maintained properly? Stuff like that: basically, thinking about contingencies. And of course, there is nothing wrong in itself with thinking about things that might go wrong. It's fine to analyze the situation and consider things that might be threats. But it can also put you in a state of analysis paralysis and fear.
And the core problem that I want to discuss here is that we as humans have the utmost confidence, I'd argue the utmost overconfidence, in our ability to anticipate problems. We basically think that we're psychic, and that is really why psychics can sell their services: because humans think that we are super good at predicting the future, that some people are, and that we ourselves are. By the way, there's a sauna over there, it's pretty cool; you can see it now, and people are skinny-dipping in it. You have probably seen this in life many times over, not necessarily in your software developer life but in life in general: you're about to make some big change, move to a new city, get a new computer, switch from Android to iPhone, whatever, it might be many things, and before you do, your brain tends to start trying to figure out things that might become problems, threats that you're anticipating, so to speak. And I don't know about you, but for me it's very rare that the things I worried about were the things that actually came to be. Not that problems didn't arise, they certainly did; it's just that the problems that arose were completely different from the ones I anticipated. And that is the core problem with this whole business of anticipating threats: your ability to accurately do it is so incredibly limited. The reason for this is that
our brains, I believe, evolved in an era a long, long time ago where we were basically planning for pretty simple contingencies of high risk. For instance: tigers hiding in the bushes, so we avoid bushes (tigers don't hide in bushes, but you get my point). Or: food is going to run out because our crops will stop growing in the winter, so we need to stockpile; stockpiling is something a lot of mammals do, and it's a pretty simple thing, but these threat assessments are one of the things we use our brains for. Or: the ice is thin, so perhaps we shouldn't walk on it, because it might break. What all these threats of the old world have in common is that they were low in complexity, so they were easy to reason about, and they were high in risk, so we were very mindful of them; we paid a lot of attention to these possible threats because we would die if they came to be. But the
problem with taking a brain that is built for this and applying it to something like software development, or moving cities, or anything really in our high-complexity, high-speed world, is this: the brain is built for low-complexity threats of high risk, while software threats are high complexity and low risk. They are really, really hard to predict, there are so many moving parts, and if things break, yes, of course it's bad. We're going to lose revenue because our customers are going to be angry with us. Perhaps we're actually going to lose money in a real way, and at worst, people are going to get fired. But generally, unless you're building medical software or running a nuclear power plant or something like that (in which case this episode does not really apply), your threats are going to be pretty low risk compared to people dying. This
is of course nothing new that I'm saying here; a lot of software developers go on about it all the time, and I'm going to go on about it today as well, because it comes from the way our brains are built. It's a bug in our brains, so we need to be constantly reminded about it. Other people have coined phrases like "premature optimization is the root of all evil", and Facebook for a while used the motto "move fast and break things" to get you out of this mindset. "Premature optimization is the root of all evil"
talks about the situation where you start (Jesus Christ!) doing optimizations of your code: "oh my god, this piece of code is so inefficient". But in reality, that code is only hit a couple of times per second, which doesn't really affect the performance of your application at all, while another part of the code, one that looks way more efficient, is hit four thousand times a second, and that's the part that needed optimization. You didn't really know that until the application was in production, or until you actually started profiling it with actual users using the actual software. And the reason you didn't see it is that the system is just so complex. Your brain is built for crops and tigers and bushes; it cannot really deal with the cognitive load that would be needed to predict and anticipate problems in a software system. That is why the phrase is "premature optimization is the root of all evil". It doesn't mean that optimization is bad; it means that you should be wary of doing it early, and analyze instead of predicting. Oh, and have a look at how cool this is! I love these old boats, with all kinds of mysterious stuff on them.
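The kind of measurement described here, finding the code path that is actually hot instead of the one that merely looks slow, can be sketched with Python's built-in profiler. The function names and the four-thousand-calls figure below just mirror the anecdote; they are not from any real codebase:

```python
import cProfile
import pstats

def looks_slow():
    # "Inefficient-looking" code that is only called once per request.
    return sum(i * i for i in range(100_000))

def looks_fine(x):
    # Innocent-looking code that is hit thousands of times per request.
    return x ** 0.5

def handle_request():
    looks_slow()
    for i in range(4_000):  # hit 4000 times, like in the anecdote
        looks_fine(i)

profiler = cProfile.Profile()
profiler.enable()
handle_request()
profiler.disable()

# Sort by cumulative time: the real hotspot surfaces here,
# regardless of which function *looked* inefficient to you.
pstats.Stats(profiler).sort_stats("cumulative").print_stats(5)
```

The point of the sketch is that the profiler's output, not your intuition, tells you which function deserves optimization.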
So, what I want to get at here is that your initial impulse in situations where there are a lot of unknowns is to try to anticipate problems and threats, but your brain is built for a low-complexity, high-danger world, while the world we're in is super high complexity and rather low danger compared to what your meat computer is tuned for. So you're going to give too much weight to the threats. But most importantly, and this is really the important thing in this case, you simply cannot predict what's going to happen. You're going to make incorrect predictions. The threats that you anticipate are probably not going to happen; what's going to happen is completely different things. And it's those completely different things, the unknown unknowns, that I would like to focus on. Because if we always have these unknowns that we cannot possibly predict, no matter how much pre-analysis we try to make, no matter how much we try to think about the problems up front before we get into things, then what do we do? Well, you just have to be very good at receiving problems, at catching them. Life is going to throw you curveballs all the time. I love Murphy's Law, that if something can go wrong, it will go wrong, and I would like it to have this addition: you will also not know what goes wrong. There are expressions like "when shit hits the fan", and it's when, not if. Shit's gonna happen, and you need to be good at dealing with it as it comes along.
So instead of trying to anticipate every single thing that can go wrong when you're doing software development, it's very important to accept that shit will go wrong, and to be very capable of dealing with it when it does: having a fast deployment process, having a ton of error logging and exception logging, having a quick turnaround cycle. If it takes you two weeks to get a release out, then you're simply not going to be especially good at dealing with these unanticipated problems. And when they do happen, you cannot get stuck in this "oh, we didn't think about this hard enough, we didn't think about everything that could happen, why didn't anybody think of this?" No: you didn't think of it, because software development is too high complexity for you to think of it. The problem is not that you were not careful enough; the problem is that you are not flexible enough to fix shit as it comes along. And I
think this is also true of real life. Worry is not a completely useless feeling, it has its uses, but often you spend too much time worrying about contingencies, when instead you should just accept that shit will happen, and develop yourself into a capable person who can deal with life's hardships as they arise. If you walk around in fear, you're going to have to walk very, very slowly, and in this world, problems are going to hit you either way. That's it, those are my thoughts on that.
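To make the "ton of error logging and exception logging" point a bit more concrete, here is a minimal sketch in Python of catching and logging unanticipated failures at a boundary, so they leave a trail instead of crashing the app. All the names here (handle_request, the payload shape) are made up for illustration; they are not from the episode:

```python
import logging

# Basic logger setup; in a real app this would ship to a log
# aggregator rather than just stderr.
logging.basicConfig(format="%(asctime)s %(levelname)s %(name)s: %(message)s")
log = logging.getLogger("app")

def handle_request(payload):
    # Hypothetical request handler; parsing can fail in ways
    # nobody anticipated up front.
    return 100 / int(payload["value"])

def safe_handle(payload):
    try:
        return handle_request(payload)
    except Exception:
        # log.exception records the full traceback, so even the
        # unknown unknowns leave a trail you can act on quickly.
        log.exception("Unanticipated failure for payload=%r", payload)
        return None

print(safe_handle({"value": "4"}))   # 25.0
print(safe_handle({"value": "0"}))   # logs ZeroDivisionError, returns None
```

The catch-all handler is deliberately broad: the whole premise is that you cannot enumerate the specific exceptions in advance, so you log everything with enough context to react fast when it happens.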
What do you think? Write a comment down below: what are your experiences with risk and failure and threats in software development? What is your balance between analyzing up front versus analyzing afterwards and dealing with failure? Stuff like that. Tell me your thoughts, I would love to hear them, comment down below. The sponsor of today's episode is you! As you might know, this show is mostly funded by you, the patrons. If you are a patron of Fun Fun Function, you get access to the Fun Fun Forum, which is a little place where you can privately discuss matters in software development with me and your fellow developers, your fellow patrons. It's a really nice little community that I'm very, very proud of. If you want to support the show and get access to that, you can go to . And in case you're completely new to the show: you have just watched an episode of Fun Fun Function. I release these every Monday morning, 08:00 GMT, but you will forget that, so you can subscribe by clicking there, or watch another episode right now by clicking there. I am MPJ, and until next Monday morning: stay curious.