Midsummer reading – Stumbling on Happiness

I picked up this book to get a new perspective on research, work-life balance and, eventually, happiness. Not that I’m miserable, but I was intrigued by the recent developments in psychology and wanted to take this on as bedtime reading. Midsummer reading, to be exact.

Well, the book is great literature for that, no doubt about it. I like the author’s style and the way he uses examples. I also found that the book has two chapters about metrics and measurement. In chapters 2 and 3, the author discusses our view of the science behind happiness and the fact that it is very difficult (though not impossible) to measure. There are ways to measure happiness, or at least to estimate it.

What I like about the author’s approach is that he uses these measures to show the temporal aspects of happiness – our estimates of how happy we will be, our happiness in the moment and, finally, our happiness after a while.

To sum up, the main point of the book is that happiness is what we create, not what we get. Just changing the way we see things, or how we compare them, can make us happier.

How do you know if you are disrupted?

Or how do the banks know that…

Recently I read a very interesting book about the disruption happening in the banking sector. I learnt that this is not the first book on the topic, and I wanted to understand how AI actually works in banking today.

We published a paper a while back as an introduction to AI for bankers, but I wanted to know more about what has happened since then. The link to the paper is here: https://doi.org/10.2308/jeta-19-04-30-21

So, what’s new about this particular book? Or what’s new about the disruption?

The book talks a lot about the financial sector and how it has evolved over the years. Bank 4.0 is a bank powered by AI, able to reason about your finances at a higher level of abstraction. Instead of asking “Siri, how much money do I have in my account?”, we can start asking questions like “Siri, can I afford the new PlayStation 5?” – and the assistant will answer that you can, but that you then need to cut down on your vacation expenses and maybe replan the purchase of a new mobile phone (the current one is getting old).

A small disclaimer here: I use Siri as an example; this has nothing to do with Apple’s services (at least nothing that I’m aware of).
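To make the idea concrete, here is a minimal sketch in Python of what such affordability reasoning could look like. Everything in it – the categories, the amounts, the rules – is invented for illustration; a real Bank 4.0 assistant would of course reason over live account data.

```python
# A toy illustration of "Bank 4.0"-style affordability reasoning.
# All categories, amounts and rules are invented for this example.
from dataclasses import dataclass

@dataclass
class PlannedExpense:
    name: str
    amount: float
    postponable: bool  # can this expense be replanned or cut down?

def can_afford(balance: float, price: float, planned: list[PlannedExpense]) -> str:
    committed = sum(e.amount for e in planned)
    free = balance - committed
    if free >= price:
        return f"Yes, with {free - price:.0f} to spare."
    shortfall = price - free
    postponable = [e for e in planned if e.postponable]
    if sum(e.amount for e in postponable) >= shortfall:
        names = ", ".join(e.name for e in postponable)
        return f"Yes, but you need to free up {shortfall:.0f} by cutting down on: {names}."
    return "No, not without going into the red."

plan = [
    PlannedExpense("vacation", 8000, postponable=True),
    PlannedExpense("new mobile phone", 6000, postponable=True),
    PlannedExpense("rent", 9000, postponable=False),
]
print(can_afford(balance=20000, price=5500, planned=plan))
```

The point is the abstraction level: the assistant answers in terms of goals and trade-offs (“cut down on vacation”, “replan the phone”) rather than raw account balances.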

Another interesting aspect of Bank 4.0 is that the new financial actors are perceived as more trustworthy than the banks. The younger generation would rather interact with actors like WeChat (https://metrics.blogg.gu.se/?p=407) than with their bank.

However, what I like best about the book is that it problematizes the concepts related to disruption – for example: how do you know that you are being disrupted? When should a company start pivoting? And is pivoting, or abandoning the old model altogether, even possible in a company that is being disrupted?

The truth… or how things can be untrue

Data veracity is the degree to which data corresponds to the true values. The concept comes from the metrological notion of “measurement trueness” – the degree to which a measurement quantifies the measured value correctly.
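To make the metrological terms concrete, here is a minimal sketch (my own illustration, not from the book) of trueness versus precision: trueness concerns the systematic offset of the mean from a reference value, precision the spread of repeated measurements. The reference value and the measurements below are invented.

```python
# Toy illustration of measurement trueness (systematic error) vs. precision.
# The reference value and the measurements are invented for this example.
from statistics import mean, stdev

reference = 100.0                        # the accepted "true" value
measurements = [103.1, 102.7, 103.4, 102.9, 103.2]

bias = mean(measurements) - reference    # trueness: closeness of the mean to the truth
spread = stdev(measurements)             # precision: agreement among repetitions

print(f"bias (trueness): {bias:+.2f}")       # a large bias means low trueness...
print(f"spread (precision): {spread:.2f}")   # ...even when the precision is high
```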

Well, that sounds very simple, but it is in fact quite complex. In our previous work, we scrutinized what it means to have veracious data in transport systems (https://ieeexplore.ieee.org/abstract/document/7535482). It turns out that “lying” is not the only option here.

In this book, the author looks into the ways in which things can be untrue. Sometimes it happens deliberately, by lying; sometimes by mistake. And sometimes, as we learn in the last chapter (the one with the Brazilian aardvark), a mistake can actually end up being accepted as the truth over time.

I recommend the book, as it is written in a fantastic manner, with examples from the real world (e.g. the alleged drone sightings over Gatwick in 2018). It even goes a bit further and discusses the need for replication studies and for more funding to make scientific results more solid and robust.

Developing sustainable software engineering programs…

This week I had a chance to present our experiences from building a sustainable software engineering program (MSc) at the University of Gothenburg.

The talk was given at the SANORD symposium at Karlstad University.

The link to the talk is here: Presentation (PDF)

Abstract:
Software Engineering is one of the newest engineering fields, and one with a growing need from society. The field develops rapidly, which poses challenges for developing sustainable software engineering education – education that allows alumni to remain effective in their work over a long period of time (the long-term impact of the education) while staying attractive to prospective students and to industry.

The objective of this presentation is to describe our experiences from using business intelligence methods to develop, profile and monitor software engineering education at the master’s level. In particular, we address the following research questions:

    • Which data sources should be used in developing the profile of a master’s program?
    • How can we combine, prioritize and communicate the analyses of the data from the different sources?
    • How can we identify barriers and enablers of attractive, sustainable software engineering education?

The results are a set of experiences from using data from national agencies in Sweden (e.g. the Swedish Council for Higher Education – UHR, and the Swedish public employment service – Arbetsförmedlingen) and international master’s education portals (mastersportal.eu) as input to the development and evaluation of a master’s program in Software Engineering at the University of Gothenburg.
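To illustrate what combining these sources can look like in practice, here is a minimal sketch in Python. The file names, columns and weights are all hypothetical – the real sources publish their data in different formats and granularities.

```python
# A hypothetical sketch of combining program indicators from several sources.
# File names, columns and weights are invented for this illustration.
import pandas as pd

applications = pd.read_csv("uhr_applications.csv")    # program, year, applicants
employment = pd.read_csv("af_employment.csv")         # program, year, employment_rate
visibility = pd.read_csv("mastersportal_views.csv")   # program, year, page_views

# Combine the sources into one profile per program and year.
profile = (
    applications
    .merge(employment, on=["program", "year"])
    .merge(visibility, on=["program", "year"])
)

# One way to prioritize: a weighted score over normalized indicators.
for col in ["applicants", "employment_rate", "page_views"]:
    profile[col + "_norm"] = profile[col] / profile[col].max()

profile["attractiveness"] = (
    0.5 * profile["applicants_norm"]
    + 0.3 * profile["employment_rate_norm"]
    + 0.2 * profile["page_views_norm"]
)
print(profile.sort_values("attractiveness", ascending=False).head())
```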

The conclusions show that using the available sources leads to creating sustainable programs, and we recommend using these data sources to a larger extent at the national and international levels.

Do SysML requirement diagrams help?

Today I had the privilege of presenting a paper at EASE 2014, done in collaboration with the University of Basilicata in Italy.

Link to presentation

The paper is an experimental validation of whether requirement diagrams speed up the understanding of requirements specifications and whether they increase or decrease comprehension. The results show that comprehension increases, while there is no change in the time needed.
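For readers curious about what such a comparison boils down to statistically, here is a minimal sketch with made-up scores (not the data from the paper): two groups – with and without requirement diagrams – compared on comprehension and on time, e.g. with a non-parametric test.

```python
# A toy sketch of the kind of group comparison behind such results.
# The scores below are invented; they are not the data from the paper.
from scipy.stats import mannwhitneyu

# Comprehension scores (e.g. correct answers) and task times (minutes)
with_diagrams = {"comprehension": [8, 9, 7, 9, 8, 10], "time": [31, 28, 35, 30, 33, 29]}
without_diagrams = {"comprehension": [6, 7, 5, 7, 6, 8], "time": [30, 29, 34, 31, 32, 30]}

for variable in ("comprehension", "time"):
    stat, p = mannwhitneyu(with_diagrams[variable], without_diagrams[variable])
    print(f"{variable}: U={stat:.1f}, p={p:.3f}")  # a small p suggests the groups differ
```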