[Context and motivation] The Internet of Things (IoT) is becoming common in everyday life. However, interacting with IoT devices often differs from interacting with, e.g., computers and other smart devices. Furthermore, an IoT device often depends on several other systems, which heavily impacts the user experience (UX). Finally, the domain is changing rapidly, driven by technological innovation. [Question/problem] In this qualitative study, we explore how companies elicit UX requirements in the context of IoT. Since data-driven approaches are a key part of contemporary IoT development, these are also considered in the study. [Principal idea/results] There is a knowledge gap around data-driven methodologies: there are examples of companies that collect large amounts of data but do not always know how to utilize it. Furthermore, many of the companies struggle to handle the larger system context, where their products and the UX they control are only one part of the complete IoT ecosystem. [Contribution] We provide qualitative empirical data from IoT-developing companies. Based on our findings, we identify challenges for the companies and areas for future work.
Video game development is a complex endeavor, often involving intricate software, large organizations, and aggressive release deadlines. Several studies have reported that periods of “crunch time” are prevalent in the video game industry, but there are few studies on the effects of time pressure. We conducted a survey with participants of the Global Game Jam (GGJ), a 48-hour hackathon. Based on 198 responses, the results suggest that: (1) iterative brainstorming is the most popular method for conceptualizing initial requirements; (2) continuous integration, minimum viable product, scope management, version control, and stand-up meetings are frequently applied development practices; (3) regular communication, internal playtesting, and dynamic and proactive planning are the most common quality assurance activities; and (4) familiarity with agile development has a weak correlation with perception of success in GGJ. We conclude that GGJ teams rely on ad hoc approaches to development and face-to-face communication, and recommend some complementary practices with limited overhead. Furthermore, as our findings are similar to recommendations for software startups, we posit that game jams and the startup scene share contextual similarities. Finally, we discuss the drawbacks of systemic “crunch time” and argue that game jam organizers are in a good position to problematize the phenomenon.
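As a sketch of the kind of analysis behind finding (4), the snippet below computes a rank correlation between two ordinal survey items. The item names and responses are hypothetical placeholders, not data from the survey, and the study's actual instrument and analysis may differ.

```python
# Hypothetical sketch: rank correlation between two Likert-scale survey items,
# e.g., familiarity with agile development (1-5) and perceived project success
# (1-5). The responses below are invented for illustration only.
from scipy.stats import spearmanr

agile_familiarity = [1, 3, 4, 2, 5, 3, 4, 2, 5, 1]
perceived_success = [2, 3, 3, 2, 4, 4, 3, 3, 5, 2]

rho, p_value = spearmanr(agile_familiarity, perceived_success)
print(f"Spearman's rho = {rho:.2f} (p = {p_value:.3f})")
# A rho around 0.2-0.3 would typically be read as a weak positive correlation.
```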
Software engineering is at the core of the digitalization of society. Ill-informed decisions can have major consequences, as made evident in the 2017 government crisis in Sweden, originating in a data breach caused by an outsourcing deal made by the Swedish Transport Agency. Many Government Agencies (GovAgs) in Sweden are rapidly undergoing a digital transition, thus it is important to get an overview of how widespread, and how mature, software development is in this part of the public sector. We present a software development census of Swedish GovAgs, complemented by document analysis and a survey. We show that 39.2% of the GovAgs develop software internally, some matching the number of developers in large companies. Our findings suggest that the development largely resembles private sector counterparts, and that established best practices are implemented. Still, we identify improvement potential in the areas of strategic sourcing, openness, collaboration across GovAgs, and quality requirements. The Swedish Government has announced the establishment of a new digitalization agency next year, and our hope is that the software engineering community will contribute its expertise with a clear voice.
With ever-increasing productivity targets in mining operations, there is a growing interest in mining automation. The PIMM project addresses the fundamental challenge of network communication by constructing a pilot 5G network in the underground mine Kankberg. In this report, we discuss how such a 5G network could constitute the essential infrastructure to organize existing systems in Kankberg into a system-of-systems (SoS). Specifically, we analyze a scenario in which LiDAR-equipped vehicles operating in the mine are connected to existing mine mapping and positioning solutions. The approach is motivated by the coming era of remote-controlled, or even autonomous, vehicles in mining operations. The proposed SoS could ensure continuously updated maps of Kankberg, rendered in unprecedented detail, supporting both productivity and safety in the underground mine. We present four different SoS solutions from an organizational point of view, discussing how development and operations of the constituent systems could be distributed among Boliden and external stakeholders, e.g., the vehicle suppliers, the hauling company, and the developers of the mapping software. The four scenarios are compared from both technical and business perspectives, based on trade-off discussions and SWOT analyses. We conclude our report by recommending continued research along two future paths: a closer cooperation with the vehicle suppliers, and further feasibility studies regarding establishing a Kankberg software ecosystem.
With ever-increasing productivity targets in mining operations, there is a growing interest in mining automation. In future mines, remote-controlled and autonomous haulers will operate underground guided by LiDAR sensors. We envision reusing LiDAR measurements to maintain accurate mine maps that would contribute to both safety and productivity. Extrapolating from a pilot project on reliable wireless communication in Boliden's Kankberg mine, we propose establishing a system-of-systems (SoS) with LiDAR-equipped haulers and existing mapping solutions as constituent systems. SoS requirements engineering inevitably adds a political layer, as independent actors are stakeholders both on the system and SoS levels. We present four SoS scenarios representing different business models, discussing how development and operations could be distributed among Boliden and external stakeholders, e.g., the vehicle suppliers, the hauling company, and the developers of the mapping software. Based on eight key variation points, we compare the four scenarios from both technical and business perspectives. Finally, we validate our findings in a seminar with participants from the relevant stakeholders. We conclude that to determine which scenario is the most promising for Boliden, trade-offs regarding control, costs, risks, and innovation must be carefully evaluated.
As Netscape co-founder Marc Andreessen famously remarked in 2011, software is eating the world, becoming a pervasive, invisible, critical infrastructure. Data on the distribution of software use and development in society is scarce, but we compile results from two novel surveys to provide a fuller picture of the role software plays in the public and private sectors in Sweden, respectively. Three out of ten Swedish firms, across industry sectors, develop software in-house. The corresponding figure for Sweden’s government agencies is four out of ten, i.e., the public sector should not be underestimated. The digitalization of society will continue, thus the demand for software developers will further increase. Many private firms report that the limited supply of software developers in Sweden is directly affecting their expansion plans. Based on our findings, we outline directions that need additional research to allow evidence-informed policy-making. We argue that such work should ideally be conducted by academic researchers and national statistics agencies in collaboration.
The area of Internet of Things (IoT) is growing and it affects a large number of users, which means that security is important. Many parts of IoT systems are built with Open Source Software, for which information about security vulnerabilities is openly available. It is important to update the software when vulnerabilities are detected, but it is unclear to what extent this is done in industry today. This study presents an investigation of industrial companies in the area of IoT to understand current procedures and challenges with respect to security updates. The research is conducted as an interview study with qualitative data analysis. It is found that few companies have formalized processes for this type of security update, and there is a need to support both producers and integrators of IoT components.
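As a concrete, hypothetical illustration of the update problem, the sketch below shows how a producer or integrator might check one component version against a public vulnerability database. It assumes the OSV.dev query API and an arbitrary PyPI package as the example; the interviewed companies' components and tooling are not described at this level of detail.

```python
# Minimal sketch: query the public OSV.dev database for known vulnerabilities
# affecting one component version. The package name, version, and ecosystem
# are illustrative assumptions, not taken from the study.
import json
import urllib.request

def known_vulnerabilities(name: str, version: str, ecosystem: str = "PyPI") -> list:
    """Return OSV vulnerability records for a single component version."""
    query = json.dumps({
        "version": version,
        "package": {"name": name, "ecosystem": ecosystem},
    }).encode("utf-8")
    request = urllib.request.Request(
        "https://api.osv.dev/v1/query",
        data=query,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response).get("vulns", [])

# Example: print advisory IDs for an (assumed) pinned dependency.
for vuln in known_vulnerabilities("requests", "2.19.1"):
    print(vuln["id"], vuln.get("summary", ""))
```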
Quality requirements is a difficult concept in software projects, and testing software qualities is a well-known challenge. Without proper management of quality requirements, there is an increased risk that the software product under development will not meet the expectations of its future users. In this paper, we share experiences from testing quality requirements when developing a large system-of-systems in the public sector in Sweden. We complement the experience reporting by analyzing documents from the case under study. As a final step, we match the identified challenges with solution proposals from the literature. We report five main challenges, covering inadequate requirements engineering and disconnected test managers. Finally, we match the challenges to solutions proposed in the scientific literature, including integrated requirements engineering, the twin peaks model, virtual plumblines, the QUPER model, and architecturally significant requirements. Our experiences are valuable to other large development projects struggling with the testing of quality requirements. Furthermore, the report can be used as input to process improvement activities in the case under study.
Motivation: Society's dependence on Open Source Software (OSS) and the communities that maintain the OSS is ever-growing. So are the potential risks of, e.g., vulnerabilities being introduced in projects not actively maintained. By assessing an OSS project's capability to stay viable and maintained over time without interruption or weakening, i.e., the OSS health, users can consider the risk implied by using the OSS as is, and if necessary, decide whether to help improve the health or choose another option. However, such assessment is complex as OSS health covers a wide range of sub-topics, and existing support is limited. Aim: We aim to create an overview of characteristics that affect the health of an OSS project and enable the assessment thereof. Method: We conduct a snowball literature review based on a start set of 9 papers, and identify 146 relevant papers over two iterations of forward and backward snowballing. Health characteristics are elicited and coded using structured and axial coding into a framework structure. Results: The final framework consists of 107 health characteristics divided among 15 themes. Characteristics address the socio-technical spectrum of the community of actors maintaining the OSS project, the software and other deliverables being maintained, and the orchestration facilitating the maintenance. Characteristics are further divided based on the level of abstraction they address, i.e., the OSS project-level specifically, or the project's overarching ecosystem of related OSS projects. Conclusion: The framework provides an overview of the wide span of health characteristics that may need to be considered when evaluating OSS health and can serve as a foundation both for research and practice.
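As a hedged illustration of the snowballing procedure described above, the toy sketch below iterates backward and forward snowballing from a seed set. The citation graph and the screening function are invented placeholders, not artifacts of the review.

```python
# Toy sketch of iterative snowballing over a citation graph. 'cites' maps a
# paper to the papers it references (backward), 'cited_by' to the papers
# citing it (forward); 'screen' encodes the inclusion criteria. All inputs
# are placeholders for whatever tooling a reviewer actually uses.
def snowball(seed, cites, cited_by, screen, iterations=2):
    included = set(seed)
    frontier = set(seed)
    for _ in range(iterations):
        candidates = set()
        for paper in frontier:
            candidates |= cites.get(paper, set())     # backward snowballing
            candidates |= cited_by.get(paper, set())  # forward snowballing
        frontier = {p for p in candidates - included if screen(p)}
        if not frontier:
            break
        included |= frontier
    return included

# Example with a tiny invented graph: start from two seed papers.
cites = {"A": {"C"}, "B": {"C", "D"}}
cited_by = {"A": {"E"}, "C": {"F"}}
print(snowball({"A", "B"}, cites, cited_by, screen=lambda p: True))
```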
The overall goal of the Road Data Lab (RoDL) project is to establish a community platform around open data for roads. The community today consists of AI Sweden, which provides a technical platform for storing and sharing data as well as legal knowledge and expertise; community cooperation with several partner organizations and other data labs spearheaded by RISE; and continued work on open data in general with Lund University and others. During the project, we have also made a number of data sets available, from the project partners and others. Lastly, we conducted a hackathon with data from the project as a way to disseminate knowledge of our data and promote utilization. We have published four data sets as part of RoDL: the Volvo highway data set, the Zenseact data, the Hövding data, and a synthetic data set for pedestrian detection. The data sets are made available under different open licenses. Working with open innovation and open data has an impact on business models. Open-source software is established today, and organizations have experience in deciding which parts of their software to make openly available. This is not the case for data. One goal of RoDL was to investigate obstacles and solutions for organizations in terms of the business of open data. However, we could only scratch the surface of this problem, mainly from a license perspective. We see a need for future work to better understand, and provide solutions for, organizations in their analysis of the business of open data.
Modern industrial production environments are rapidly transforming. Concepts such as smart industry and Industry 4.0 encompass many expectations on how digital technology can improve industrial plants. Some strands are better algorithms for robotics, better situational awareness through ubiquitous RFID, fewer production interruptions through smarter predictive maintenance, and more agile production lines enabling greater customization of products. Many of these ideas depend on reliable access to IT services such as computing power and data availability. If these services falter, the benefits will not materialize. Therefore, it is crucial to study the Service Level Agreements (SLAs) that are used to regulate such services.
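To make concrete what such SLAs typically quantify, here is a minimal sketch that converts an availability target into a downtime budget. The targets shown are generic textbook figures, not values from the project.

```python
# Illustrative sketch: translate an SLA availability target into the maximum
# tolerated downtime per 30-day month. Targets are generic examples.
def downtime_budget_minutes(availability_percent: float,
                            period_hours: float = 30 * 24) -> float:
    """Maximum downtime in minutes allowed by an availability target."""
    return period_hours * 60 * (1 - availability_percent / 100)

for target in (99.0, 99.9, 99.99):
    print(f"{target}% availability -> {downtime_budget_minutes(target):.1f} min/month")
# 99.0% -> 432.0 min/month; 99.9% -> 43.2; 99.99% -> 4.3
```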
Digitalization and servitization are impacting many domains, including the mining industry. As the equipment becomes connected and the technical infrastructure evolves, business models and risk management need to adapt. In this paper, we present a study on how changes in asset and risk distribution are evolving for the actors in a software ecosystem (SECO) and system-of-systems (SoS) around a mining operation. We have performed a survey to understand how Service Level Agreements (SLAs), a common mechanism for managing risk, are used in other domains. Furthermore, we have performed a focus group study with companies. There is an overall trend in the mining industry to move the investment cost (CAPEX) from the mining operator to the vendors. Hence, the mining operator instead leases the equipment (as operational expense, OPEX) or even acquires a service. This change in business model impacts operation, as knowledge moves from the mining operator to the suppliers. Furthermore, as the infrastructure becomes more complex, the mining operator becomes more and more reliant on the suppliers for operation and maintenance. As this change is still in an early stage, there is no formalized risk management, e.g., through SLAs, in place. Rather, at present, the companies in the ecosystem rely more on trust and the incentives created by the promise of mutual future benefits from innovation activities. We believe there is a need to better understand how to manage risk in a SECO as it is established and evolves. At the same time, since the focus in a SECO is on cooperation and innovation, the companies have no incentive to address risk unless there is an incident. Therefore, we believe industry needs help in systematically understanding risk and defining quality aspects, such as reliability and performance, in the new business environment.
Software products are rarely developed from scratch, and vulnerabilities in such products might reside in parts that are either open source software or provided by another organization. Hence, the total cybersecurity of a product often depends on cooperation, explicit or implicit, between several organizations. We study the attitudes and practices of companies in software ecosystems towards sharing vulnerability information. Furthermore, we compare these practices to contemporary cybersecurity recommendations. This is done through a questionnaire-based qualitative survey. The questionnaire is divided into two parts: the providers' perspective and the acquirers' perspective. The results show that companies are willing to share vulnerability information with each other. Sharing is not considered harmful, either to cybersecurity or to their business, even though a majority of the respondents consider vulnerability information sensitive. However, despite being open to sharing, the companies are less inclined to share vulnerability information proactively. Furthermore, the providers do not perceive a large interest in vulnerability information from their customers. Hence, the companies' overall attitude to sharing vulnerability information is passive but open. In contrast, contemporary cybersecurity guidelines recommend active disclosure and sharing among actors in an ecosystem.
Data-defined software is becoming more and more prevalent, especially with the advent of machine learning and artificial intelligence. With data-defined systems come both challenges, such as continuing to collect and maintain quality data, and opportunities, such as open innovation by sharing with others. We propose Open Data Collaboration (ODC) to describe pecuniary and non-pecuniary sharing of open data, similar to Open Source Software. To understand the challenges and opportunities of ODC, we ran focus groups with 22 companies and organizations. We observed an interest in the subject, but we conclude that the overall maturity is low and ODC is rare.
Data-intensive software is becoming more and more prevalent, especially with the advent of machine learning and artificial intelligence. With data-intensive systems come both challenges, such as continuing to collect and maintain quality data, and opportunities, such as open innovation by sharing with others.
To understand challenges and opportunities with ODC, we ran 5 focus groups (4 in Lund and 1 in Kista) with companies and public organizations. We had 27 participants from 22 organizations.
Despite an interest in participating and an understanding of the subject's potential, the overall maturity is low and ODC is rare. For ODC to be successful, technical, organizational, business, and legal aspects need to be studied further.
Quality requirements deal with how well a product should perform the intended functionality, such as start-up time and learnability. Researchers argue that they are important, while studies at the same time indicate deficiencies in practice. Our goal is to review the state of evidence for quality requirements. We want to understand the empirical research on quality requirements topics as well as evaluations of quality requirements solutions. We used a hybrid method for our systematic literature review. We defined a start set based on two literature reviews combined with a keyword-based search from selected publication venues, and snowballed based on the start set. We screened 530 papers and included 84 papers in our review. The case study method is the most common (43), followed by surveys (15) and tests (13). We found no replication studies. The two most commonly studied themes are (1) differentiating characteristics of quality requirements compared to other types of requirements, and (2) the importance and prevalence of quality requirements. Quality models, QUPER, and the NFR method are evaluated in several studies, with positive indications. Goal modeling is the only modeling approach evaluated. However, all studies are small-scale, and long-term costs and impact are not studied. We conclude that more research is needed, as empirical research on quality requirements is not increasing at the same rate as software engineering research in general. We see a gap between research and practice: the solutions proposed are usually evaluated in an academic context, and surveys on quality requirements in industry indicate unsystematic handling of quality requirements.
Context: Quality requirements are important for product success, yet often handled poorly. The problems with scope decisions lead to delayed handling and an unbalanced scope. Objective: This study characterizes the scope decision process to understand the influencing factors and properties affecting the scope decisions of quality requirements. Method: We studied one company's scope decision process over a period of five years. We analyzed the decision artifacts and interviewed experienced engineers involved in the scope decision process. Results: Features addressing quality aspects explicitly are a minor part (4.41%) of all features handled. The phase of the product line seems to influence the prevalence and acceptance rate of quality features. Lastly, relying on external stakeholders and upfront analysis seems to lead to long lead times and an insufficient quality requirements scope. Conclusions: There is a need to make quality more explicit in the scope decision process. We propose a scope decision process at a strategic level and a tactical level: the former to address long-term planning, and the latter to cater for a speedy process. Furthermore, we believe it is key to balance stakeholder input with feedback from usage and the market in a more direct way than through a long plan-driven process.
[Context and motivation] Quality requirements (QRs) are inherently difficult to manage as they are often subjective, context-dependent, and hard to fully grasp by various stakeholders. Furthermore, there are many sources that can provide input on important QRs and suitable levels. Responding to customer needs in a timely manner and realizing them in product portfolio and product scope decisions remains the main challenge. [Question/problem] Data-driven methodologies based on product usage data analysis are gaining popularity and enable new (bottom-up, feedback-driven) ways of planning and evaluating QRs in product development. Can these be efficiently combined with established top-down, forward-driven management of QRs? [Principal idea/Results] We propose a model for how to handle decisions about QRs at a strategic and an operational level, encompassing product decisions as well as business intelligence and usage data. We inferred the model from an extensive empirical investigation of five years of decision-making history at a large B2C company. We illustrate the model by assessing two industrial case studies from different domains. [Contribution] We believe that utilizing the right approach in the right situation will be key to handling QRs, as different groups of QRs as well as different domains have their own special characteristics.
Quality requirements are vital to developing successful software products. However, there is evidence that quality requirements are mostly managed in an “ad hoc” manner and down-prioritized. This may result in insecure, unstable, slow products, and unhappy customers. We have developed a conceptual model for the scoping process of quality requirements, QREME, and an assessment model, Q-REPM, for companies to benchmark against when evaluating and improving their quality requirements practices. Our model balances an upfront forward-loop with a data-driven feedback-loop. Furthermore, it addresses both strategic and operational decisions. We have evaluated the model in a multi-case study at two companies in Sweden and three companies in the Netherlands. We assessed the scoping process practices for quality requirements and provided improvement recommendations on which practices to improve. The study confirms the existence of the constructs underlying QREME. The companies perform, in the median, 24% of the suggested actions in Q-REPM. None of the companies work data-driven with their quality requirements, even though four out of five companies could technically do so. Furthermore, on the strategic level, quality requirements practices are not systematically performed by any of the companies. The conceptual model and assessment model capture a relevant view of quality requirements practices and offer relevant improvement proposals. However, we believe there is a need to couple quality requirements practices to internal and external success factors to motivate companies to change their ways of working. We also see improvement potential in the area of business intelligence for QREME, in selecting data sources and relevant stakeholders.
Data-driven software is becoming prevalent, especially with the advent of machine learning and artificial intelligence. With data-driven systems come both challenges, such as continuing to collect and maintain high-quality data, and opportunities, such as open innovation by sharing data with others. We propose Open Data Collaboration (ODC) to describe pecuniary and non-pecuniary sharing of open data, similar to Open Source Software (OSS) and in contrast to Open Government Data (OGD), where public authorities share data. To understand the challenges and opportunities of ODC, we organized five focus groups with in total 27 practitioners from 22 companies, public organizations, and research institutes. In the discussions, we observed a general interest in the subject, both from private companies and public authorities. We also noticed similarities in attitudes to open innovation practices, i.e., initial resistance that gradually turned into interest. While several of the participants were experienced in open source software, none had shared data openly. Based on the findings, we identify challenges that we set out to continue addressing in future research.
Software systems increasingly depend on data, particularly with the rising use of machine learning, and developers are looking for new sources of data. Open Data Ecosystems (ODE) is an emerging concept for data sharing under public licenses in software ecosystems, similar to Open Source Software (OSS). It has certain similarities to Open Government Data (OGD), where public agencies share data for innovation and transparency. We aimed to explore open data ecosystems involving commercial actors. Thus, we organized five focus groups with 27 practitioners from 22 companies, public organizations, and research institutes. Based on the outcomes, we surveyed three cases of emerging ODE practice to further understand the concepts and to validate the initial findings. The main outcome is an initial conceptual model of ODEs’ value, intrinsics, governance, and evolution, together with propositions for practice and further research. We found that an ODE must be value-driven. Regarding the intrinsics of data, we found their type, meta-data, and legal frameworks influential for their openness. We also found the characteristics of ecosystem initiation, organization, data acquisition, and openness to be differentiating, which we advise research and practice to take into consideration.
Systems of systems (SoS) consist of independently owned, operated, and developed constituent systems that work together for mutual benefit. Co-opetitive systems of systems consist of constituent systems that, in addition, also compete. In this paper, we focus on quality requirements engineering for a constituent system developer in such an SoS. We discuss the needs and requirements of a structured quality requirements engineering process, with examples taken from the transportation domain, and find that there is a need for mediators and agreements between constituent system developers to enable quality data exchange.
Software engineering is decision-intensive. Evidence-based software engineering is suggested for decision-making concerning the use of methods and technologies when developing software. Software development often includes the reuse of software assets, for example, open-source components. Which components to use has implications for the quality of the software (e.g., maintainability). Thus, research is needed to support decision-making for composite software. This paper presents a roadmap for the research required to support evidence-based decision-making for choosing and integrating assets in composite software systems. The roadmap was developed as an output of a 5-year project in the area, including researchers from three different organizations. It was developed in an iterative process and is based on (1) systematic literature reviews of the area; (2) investigations of the state of practice, including a case survey and a survey; and (3) development and evaluation of solutions for asset identification and selection. The research activities resulted in identifying 11 areas in need of research. The areas are grouped into two categories: areas enabling evidence-based decision-making and areas related to supporting the decision-making. The roadmap outlines research needs in these 11 areas. The research challenges and research directions presented in this roadmap are key areas for further research to support evidence-based decision-making for composite software.