[Context and motivation] The Internet of Things (IoT) is becoming common in everyday life. However, the interaction often differs from that of, e.g., computers and other smart devices. Furthermore, an IoT device often depends on several other systems, which heavily impacts the user experience (UX). Finally, the domain is changing rapidly and is driven by technological innovation. [Question/problem] In this qualitative study, we explore how companies elicit UX requirements in the context of IoT. Data-driven approaches are a key part of contemporary IoT development and are therefore also considered in the study. [Principal idea/results] There is a knowledge gap around data-driven methodologies: there are examples of companies that collect large amounts of data but do not always know how to utilize it. Furthermore, many of the companies struggle to handle the larger system context, where their products and the UX they control are only one part of the complete IoT ecosystem. [Contribution] We provide qualitative empirical data from IoT developing companies. Based on our findings, we identify challenges for the companies and areas for future work.
Video game development is a complex endeavor, often involving complex software, large organizations, and aggressive release deadlines. Several studies have reported that periods of “crunch time” are prevalent in the video game industry, but there are few studies on the effects of time pressure. We conducted a survey with participants of the Global Game Jam (GGJ), a 48-hour hackathon. Based on 198 responses, the results suggest that: (1) iterative brainstorming is the most popular method for conceptualizing initial requirements; (2) continuous integration, minimum viable product, scope management, version control, and stand-up meetings are frequently applied development practices; (3) regular communication, internal playtesting, and dynamic and proactive planning are the most common quality assurance activities; and (4) familiarity with agile development has a weak correlation with perception of success in GGJ. We conclude that GGJ teams rely on ad hoc approaches to development and face-to-face communication, and recommend some complementary practices with limited overhead. Furthermore, as our findings are similar to recommendations for software startups, we posit that game jams and the startup scene share contextual similarities. Finally, we discuss the drawbacks of systemic “crunch time” and argue that game jam organizers are in a good position to problematize the phenomenon.
Software engineering is at the core of the digitalization of society. Ill-informed decisions can have major consequences, as made evident in the 2017 government crisis in Sweden, originating in a data breach caused by an outsourcing deal made by the Swedish Transport Agency. Many Government Agencies (GovAgs) in Sweden are rapidly undergoing a digital transition, thus it is important to overview how widespread, and mature, software development is in this part of the public sector. We present a software development census of Swedish GovAgs, complemented by document analysis and a survey. We show that 39.2% of the GovAgs develop software internally, some matching the number of developers in large companies. Our findings suggest that the development largely resembles private sector counterparts, and that established best practices are implemented. Still, we identify improvement potential in the areas of strategic sourcing, openness, collaboration across GovAgs, and quality requirements. The Swedish Government has announced the establishment of a new digitalization agency next year, and our hope is that the software engineering community will contribute its expertise with a clear voice.
With ever-increasing productivity targets in mining operations, there is a growing interest in mining automation. The PIMM project addresses the fundamental challenge of network communication by constructing a pilot 5G network in the underground mine Kankberg. In this report, we discuss how such a 5G network could constitute the essential infrastructure to organize existing systems in Kankberg into a system-of-systems (SoS). Specifically, we analyze a scenario in which LiDAR-equipped vehicles operating in the mine are connected to existing mine mapping and positioning solutions. The approach is motivated by the approaching era of remote-controlled, or even autonomous, vehicles in mining operations. The proposed SoS could ensure continuously updated maps of Kankberg, rendered in unprecedented detail, supporting both productivity and safety in the underground mine. We present four different SoS solutions from an organizational point of view, discussing how development and operations of the constituent systems could be distributed among Boliden and external stakeholders, e.g., the vehicle suppliers, the hauling company, and the developers of the mapping software. The four scenarios are compared from both technical and business perspectives, based on trade-off discussions and SWOT analyses. We conclude our report by recommending continued research along two future paths, namely a closer cooperation with the vehicle suppliers and further feasibility studies regarding establishing a Kankberg software ecosystem.
With ever-increasing productivity targets in mining operations, there is a growing interest in mining automation. In future mines, remote-controlled and autonomous haulers will operate underground guided by LiDAR sensors. We envision reusing LiDAR measurements to maintain accurate mine maps that would contribute to both safety and productivity. Extrapolating from a pilot project on reliable wireless communication in Boliden's Kankberg mine, we propose establishing a system-of-systems (SoS) with LiDAR-equipped haulers and existing mapping solutions as constituent systems. SoS requirements engineering inevitably adds a political layer, as independent actors are stakeholders both on the system and SoS levels. We present four SoS scenarios representing different business models, discussing how development and operations could be distributed among Boliden and external stakeholders, e.g., the vehicle suppliers, the hauling company, and the developers of the mapping software. Based on eight key variation points, we compare the four scenarios from both technical and business perspectives. Finally, we validate our findings in a seminar with participants from the relevant stakeholders. We conclude that to determine which scenario is the most promising for Boliden, trade-offs regarding control, costs, risks, and innovation must be carefully evaluated.
The area of Internet of Things (IoT) is growing and it affects a large number of users, which means that security is important. Many parts of IoT systems are built with Open Source Software, for which security vulnerabilities are publicly available. It is important to update the software when vulnerabilities are detected, but it is unclear to what extent this is done in industry today. This study presents an investigation of industrial companies in the area of IoT to understand current procedures and challenges with respect to security updates. The research is conducted as an interview study with qualitative data analysis. It is found that few companies have formalized processes for this type of security updates, and there is a need to support both producers and integrators of IoT components.
Quality requirements are a difficult concept in software projects, and testing software qualities is a well-known challenge. Without proper management of quality requirements, there is an increased risk that the software product under development will not meet the expectations of its future users. In this paper, we share experiences from testing quality requirements when developing a large system-of-systems in the public sector in Sweden. We complement the experience reporting by analyzing documents from the case under study. As a final step, we match the identified challenges with solution proposals from the literature. We report five main challenges, covering inadequate requirements engineering and disconnected test managers. Finally, we match the challenges to solutions proposed in the scientific literature, including integrated requirements engineering, the twin peaks model, virtual plumblines, the QUPER model, and architecturally significant requirements. Our experiences are valuable to other large development projects struggling with testing of quality requirements. Furthermore, the report could be used as input to process improvement activities in the case under study.
Modern industrial production environments are rapidly transforming. Concepts such as smart industry and Industry 4.0 encompass many expectations on how digital technology can improve industrial plants. Some strands are better algorithms for robotics, better situational awareness through ubiquitous RFID, fewer production interruptions through smarter predictive maintenance, and more agile production lines enabling greater customization of products. Many of these ideas depend on reliable access to IT services such as computing power and data availability. If these falter, the benefits will not materialize. Therefore, it is crucial to study the Service Level Agreements (SLAs) that are used to regulate such services.
Digitalization and servitization are impacting many domains, including the mining industry. As equipment becomes connected and technical infrastructure evolves, business models and risk management need to adapt. In this paper, we present a study on how changes in asset and risk distribution are evolving for the actors in a software ecosystem (SECO) and system-of-systems (SoS) around a mining operation. We have performed a survey to understand how Service Level Agreements (SLAs) - a common mechanism for managing risk - are used in other domains. Furthermore, we have performed a focus group study with companies. There is an overall trend in the mining industry to move the investment cost (CAPEX) from the mining operator to the vendors. Hence, the mining operator instead leases the equipment (as operational expense, OPEX) or even acquires a service. This change in business model impacts operation, as knowledge moves from the mining operator to the suppliers. Furthermore, as the infrastructure becomes more complex, the mining operator becomes increasingly reliant on the suppliers for operation and maintenance. As this change is still at an early stage, there is no formalized risk management, e.g. through SLAs, in place. Rather, at present, the companies in the ecosystem rely more on trust and the incentives created by the promise of mutual future benefits of innovation activities. We believe there is a need to better understand how to manage risk in a SECO as it is established and evolves. At the same time, since the focus in a SECO is on cooperation and innovation, the companies do not have incentives to address this unless there is an incident. Therefore, we believe industry needs help in systematically understanding risk and defining quality aspects such as reliability and performance in the new business environment.
Software products are rarely developed from scratch and vulnerabilities in such products might reside in parts that are either open source software or provided by another organization. Hence, the total cybersecurity of a product often depends on cooperation, explicit or implicit, between several organizations. We study the attitudes and practices of companies in software ecosystems towards sharing vulnerability information. Furthermore, we compare these practices to contemporary cybersecurity recommendations. This is performed through a questionnaire-based qualitative survey. The questionnaire is divided into two parts: the providers' perspective and the acquirers' perspective. The results show that companies are willing to share information with each other regarding vulnerabilities. Sharing is not considered harmful, neither to cybersecurity nor to their business, even though a majority of the respondents consider vulnerability information sensitive. However, the companies, despite being open to sharing, are less inclined to proactively share vulnerability information. Furthermore, the providers do not perceive a large interest in vulnerability information from their customers. Hence, the companies' overall attitude to sharing vulnerability information is passive but open. In contrast, contemporary cybersecurity guidelines recommend active disclosure and sharing among actors in an ecosystem.
Data-defined software is becoming more and more prevalent, especially with the advent of machine learning and artificial intelligence. With data-defined systems come both challenges - to continue to collect and maintain quality data - and opportunities - open innovation by sharing with others. We propose Open Data Collaboration (ODC) to describe pecuniary and non-pecuniary sharing of open data, similar to Open Source Software. To understand the challenges and opportunities with ODC, we ran focus groups with 22 companies and organizations. We observed an interest in the subject, but we conclude that the overall maturity is low and that ODC is rare.
Data-intensive software is becoming more and more prevalent, especially with the advent of machine learning and artificial intelligence. With data-intensive systems come both challenges – to continue to collect and maintain quality data – and opportunities – open innovation by sharing with others.
To understand the challenges and opportunities with ODC, we ran five focus groups (four in Lund and one in Kista) with companies and public organizations. In total, we had 27 participants from 22 organizations.
Despite an interest in participating and an understanding of the potential of the subject, the overall maturity is low and ODC is rare. For ODC to be successful, the technical, organizational, business, and legal aspects need to be studied further.
Context: Quality requirements are important for product success yet often handled poorly. Problems with scope decisions lead to delayed handling and an unbalanced scope. Objective: This study characterizes the scope decision process to understand the influencing factors and properties affecting scope decisions for quality requirements. Method: We studied one company's scope decision process over a period of five years. We analyzed the decision artifacts and interviewed experienced engineers involved in the scope decision process. Results: Features addressing quality aspects explicitly are a minor part (4.41%) of all features handled. The phase of the product line seems to influence the prevalence and acceptance rate of quality features. Lastly, relying on external stakeholders and upfront analysis seems to lead to long lead times and an insufficient quality requirements scope. Conclusions: There is a need to make quality more explicit in the scope decision process. We propose a scope decision process at a strategic level and a tactical level, the former to address long-term planning and the latter to cater for a speedy process. Furthermore, we believe it is key to balance stakeholder input with feedback from usage and the market in a more direct way than through a long plan-driven process.
[Context and motivation] Quality requirements (QRs) are inherently difficult to manage as they are often subjective, context-dependent and hard to fully grasp by various stakeholders. Furthermore, there are many sources that can provide input on important QRs and suitable levels. Responding to customer needs in a timely manner and realizing them in product portfolio and product scope decisions remains the main challenge. [Question/problem] Data-driven methodologies based on product usage data analysis are gaining popularity and enable new (bottom-up, feedback-driven) ways of planning and evaluating QRs in product development. Can these be efficiently combined with established top-down, forward-driven management of QRs? [Principal idea/Results] We propose a model for how to handle decisions about QRs at a strategic and operational level, encompassing product decisions as well as business intelligence and usage data. We inferred the model from an extensive empirical investigation of five years of decision-making history at a large B2C company. We illustrate the model by assessing two industrial case studies from different domains. [Contribution] We believe that utilizing the right approach in the right situation will be key for handling QRs, as both different groups of QRs and domains have their special characteristics.