The overall purpose of this project is to develop a cross-platform CPR (cardiopulmonary resuscitation) application, mainly for iOS and Android devices. The application aims to guide healthcare workers through the steps and recommended medications during a cardiac arrest, which is often a fast-paced and stressful scenario. The application needs to provide time-based and action-history-based recommendations for the next medication and step. Each step taken, and the time at which it was performed, must also be documented automatically through use of the application. The application is implemented using React Native, a framework developed by Facebook in 2015 for mobile application development. React Native is based on React, a JavaScript library released in 2013 for building web interfaces, and allows mobile applications for iOS and Android to be created from a single codebase. The project resulted in a mobile application capable of running on both iOS and Android platforms, with enough functionality to be used in a simulation of the CPR procedure during a cardiac arrest rescue scenario.
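To illustrate the documentation and recommendation logic described above, here is a minimal sketch, written in Python rather than the app's JavaScript, with invented names (ActionLog, next_recommendation) and illustrative timing rules that are placeholders, not clinical guidance:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import List, Optional, Tuple

@dataclass
class ActionLog:
    """Automatically records each action taken, with its timestamp."""
    entries: List[Tuple[datetime, str]] = field(default_factory=list)

    def record(self, action: str) -> None:
        self.entries.append((datetime.now(), action))

    def last(self, action: str) -> Optional[datetime]:
        times = [t for t, a in self.entries if a == action]
        return max(times) if times else None

def next_recommendation(log: ActionLog) -> str:
    """Time- and history-based suggestion. The 4-minute interval below is an
    illustrative placeholder, not clinical guidance."""
    last_epi = log.last("epinephrine")
    if last_epi is None or datetime.now() - last_epi > timedelta(minutes=4):
        return "consider epinephrine"
    return "continue compressions; reassess rhythm"
```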
Context: Design smell prioritization is a significant activity that streamlines software quality enhancement and extends the software life cycle. Objective: A multi-criteria merge strategy for design smell prioritization is described. The strategy is exemplified with the case of the God Class design smell. Method: An empirical adjustment of the strategy is performed using a dataset of 24 open source projects. An empirical evaluation was conducted to check how the top-ranked God Classes obtained by the proposed technique compare against the top-ranked God Classes according to the opinion of the developers involved in each project in the dataset. Results: The evaluation shows that the strategy should be improved. The differences between projects where respondents' answers correlate with the strategy and projects where there is no correlation should be analysed. © 2022 The Author(s)
This paper is an experience report of team Halmstad's participation in a competition organised by the i-GAME project, the Grand Cooperative Driving Challenge 2016. The competition was held in Helmond, The Netherlands, during the last weekend of May 2016. We give an overview of our car's control and communication system, which was developed for the competition following the requirements and specifications of the i-GAME project. In particular, we describe our implementation of cooperative adaptive cruise control, our solution to the communication and logging requirements, and the high-level decision-making support. During the actual competition we did not manage to reach all of the goals set by the organizers as well as by ourselves. However, this did not prevent us from outperforming the competition. Moreover, the competition allowed us to collect data for further evaluation of our solutions to cooperative driving. Thus, we discuss what we believe were the strong points of our system, and we discuss post-competition evaluation of the developments that were not fully integrated into our system at competition time. © 2000-2011 IEEE.
This paper provides some background and the roadmap of the AUTO-CAAS project, a 3-year project financed by the Swedish Knowledge Foundation and ongoing as a joint effort among three academic and industrial partners. The aim of the project is to exploit the formal models of the AUTOSAR standard, developed by the project's industrial partner Quviq AB, in order to predict possible future failures in concrete implementations of components. To this end, deviations from the formal specification will be exploited to generate test cases that can push concrete components into the corners where such deviations result in observable failures. The same information will also be used in the diagnosis of otherwise detected failures in order to pinpoint their root causes.
Today, more and more devices are connected to the internet, providing otherwise quite limited hardware with the ability to perform more complex calculations. This project aims to create a system for managing a user's radar devices using a cloud platform. The system also lets the user upload their own custom applications, which can make use of data provided by the radar device, run on virtual machines and, if required, push notifications to the user's mobile applications. To simplify development, the system has been divided into three separate subsystems: the radar device, the cloud service and the mobile application. The result of the project is a complete system with a web application that allows the user to register their radar device(s), upload source code that is compiled and run on the cloud platform, and send push notifications to a mobile application.
We extend the theory of input-output conformance testing to the setting of software product lines. In particular, we allow for input-output featured transition systems to be used as the basis for generating test suites and test cases. We introduce refinement operators both at the level of models and at the level of test suites that allow for projecting them into a specific product configuration (or a product sub-line). We show that the two sorts of refinement are consistent and lead to the same set of test-cases. © Copyright 2014 ACM
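As a concrete, if simplified, illustration of projection, the following sketch shows one plausible encoding of a featured transition system in Python, with invented names (FTS, project); it is not the paper's formalism, but it captures the idea that a product configuration selects exactly the transitions whose feature guards it satisfies:

```python
from dataclasses import dataclass
from typing import Callable, List, Set

@dataclass(frozen=True)
class Transition:
    src: str
    label: str                          # e.g. "?coin" (input), "!tea" (output)
    dst: str
    guard: Callable[[Set[str]], bool]   # feature expression over a configuration

@dataclass
class FTS:
    initial: str
    transitions: List[Transition]

    def project(self, config: Set[str]) -> "FTS":
        """Project the family model onto one product: keep exactly the
        transitions whose feature guard is satisfied by `config`."""
        return FTS(self.initial, [t for t in self.transitions if t.guard(config)])

# A vending machine whose "!tea" branch exists only if feature "T" is selected.
fts = FTS("s0", [
    Transition("s0", "?coin", "s1", lambda c: True),
    Transition("s1", "!coffee", "s0", lambda c: "C" in c),
    Transition("s1", "!tea", "s0", lambda c: "T" in c),
])
coffee_only = fts.project({"C"})   # the "!tea" transition is dropped
```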
We extend the theory of input-output conformance (IOCO) testing to accommodate behavioral models of software product lines (SPLs). We present the notions of residual and spinal testing. These notions allow for structuring the test process for SPLs by taking variability into account and extracting separate test suites for common and specific features of an SPL. The introduced notions of residual and spinal test suites allow for focusing on newly introduced behavior and avoiding unnecessary re-testing of the old one. Residual test suites are very conservative in that they require retesting all old behavior that can reach new behavior. Spinal test suites, in contrast, more aggressively prune the old tests and keep only those test sequences that are necessary for reaching the new behavior. We show that residual testing is complete but does not usually lead to much reduction in the test suite, whereas spinal testing is not necessarily complete but does reduce the test suite. We give sufficient conditions on the implementation to guarantee completeness of spinal testing. Finally, we specify and analyze an example regarding the Ceiling Speed Monitoring Function from the European Train Control System. (C) 2016 The Author(s). Published by Elsevier Inc.
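The contrast between the two notions can be caricatured on plain test sequences; the sketch below is a deliberate simplification of the formal definitions, with invented helper names (residual_suite, spinal_suite) and a crude "contains a new action" criterion standing in for reachability of new behavior:

```python
from typing import List, Sequence, Set, Tuple

def residual_suite(old_tests: List[Sequence[str]],
                   new_actions: Set[str]) -> List[Sequence[str]]:
    """Conservative: keep every old test that can reach newly added behavior
    (here crudely: any test containing at least one new action)."""
    return [t for t in old_tests if any(a in new_actions for a in t)]

def spinal_suite(old_tests: List[Sequence[str]],
                 new_actions: Set[str]) -> List[Tuple[str, ...]]:
    """Aggressive: keep only the prefix of each test up to and including the
    first new action -- the 'spine' needed to reach the new behavior."""
    spines: Set[Tuple[str, ...]] = set()
    for t in old_tests:
        for i, a in enumerate(t):
            if a in new_actions:
                spines.add(tuple(t[:i + 1]))
                break
    return sorted(spines)
```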
A major challenge in testing software product lines is efficiency. In particular, testing a product line should take less effort than testing each and every product individually. We address this issue in the context of input-output conformance testing, which is a formal theory of model-based testing. We extend the notion of conformance testing on input-output featured transition systems with the novel concept of spinal test suites. We show how this concept dispenses with retesting the common behavior among different, but similar, products of a software product line. © H. Beohar & M.R. Mousavi.
Regression testing is a means to ensure that a change to the software, or to its execution environment, does not introduce new defects. It involves the expensive undertaking of rerunning test cases. Several techniques have been proposed to reduce the number of test cases to execute in regression testing; however, there is no research on how to assess the industrial relevance and applicability of such techniques. We conducted a systematic literature review with two goals: firstly, to enable researchers to design and present regression testing research with a focus on industrial relevance and applicability; and secondly, to facilitate the industrial adoption of such research by addressing the attributes of concern from the practitioners' perspective. Using a reference-based search approach, we identified 1068 papers on regression testing. We then reduced the scope to only include papers with explicit discussions about relevance and applicability (i.e. mainly studies involving industrial stakeholders). Uniquely in this literature review, practitioners were consulted at several steps to increase the likelihood of achieving our aim of identifying factors important for relevance and applicability. We have summarised the results of these consultations and an analysis of the literature in three taxonomies, which capture aspects of the industrial relevance of regression testing techniques. Based on these taxonomies, we mapped 38 papers reporting the evaluation of 26 regression testing techniques in industrial settings. © The Author(s) 2019
Performing product development means simultaneously developing product systems, production processes and marketing efforts. Product development processes are often complex, as they are time dependent and contain many inter-dependencies, e.g. between the parts and individuals involved. Due to the complexity of these processes, research methods for studying integrated product development processes need to be designed differently from traditional research methods. The study of details of totalities can work for mechanical systems but less well for complex systems. A deeper knowledge of complex systems requires substantial researcher involvement and participation in real time, which Insider Action Research (IAR) is designed for. IAR can be performed from three main positions: as observer, team member or project leader. The following conclusions have been drawn (Bjork 2003): The IAR approach is beneficial in most types of development processes and projects when the aim of the research is to achieve increased knowledge and understanding. A mix of detailed and narrative descriptions of a research project also provides practitioners with an opportunity to adopt the findings. The implementation of research results is thereby facilitated, which is an effect that most researchers as well as practitioners would benefit from.
This thesis was conducted with the objective of reducing water consumption by optimizing the cooling systems of steam sterilizers. As water is a precious resource with great environmental effects, it is important not to waste it. Consequently, there is a need for a more resource-efficient cooling water system. The project focuses on the development of a system that more efficiently regulates cooling water utilization by optimizing temperatures. The goal of the project is to achieve a 20% reduction in water consumption of the GSS-91413 model steam sterilizer manufactured by Getinge. In order to achieve the goal, changes were made to the cooling system and its control logic. By integrating a proportional valve at the outlet of the cooling system, the system was pressurized with the coolant, resulting in greater energy transfer between the condensate and the coolant. The developed control logic incorporates process data combined with an equation-based approach that utilizes temperature data to adjust the proportional valve, leading to increased control of the coolant flow. As a result, the overall water consumption of the system was reduced by more than 50%, while the maximum temperature of the system did not rise by more than 1.5%.
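A minimal sketch of the kind of temperature-driven, equation-based valve control described above (the actual Getinge control logic is not reproduced here; the setpoint, gain, and minimum opening are invented for illustration):

```python
def valve_opening(t_coolant_out: float,
                  t_target: float = 70.0,
                  gain: float = 0.05,
                  min_open: float = 0.2) -> float:
    """Proportional, equation-based control: open the valve further the more
    the coolant outlet temperature exceeds the target, so coolant flows
    freely only when the heat load requires it. Returns an opening in [0, 1]."""
    opening = min_open + gain * (t_coolant_out - t_target)
    return max(0.0, min(1.0, opening))

# e.g. valve_opening(85.0) -> 0.95: hot outlet, valve nearly fully open
```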
To test a Software Product Line (SPL), test artifacts and techniques must be extended to support variability. In general, when new SPL products are developed, more tests are generated to cover new or modified features. A dominant source of extra effort for such tests is the concretization of newly generated tests. Thus, minimizing the number of new tests that must be concretized to perform conformance testing on new products reduces the overall test effort. In this paper, we propose a test reuse strategy for conformance testing of SPL products that aims at reducing test effort. We use incremental test generation methods based on finite state machines (FSMs) to maximize test reuse. We combine these methods with a selection algorithm used to identify non-redundant concretized tests. We illustrate our strategy using examples and a case study with an embedded mobile SPL. The results indicate that our strategy can save up to 36% of test effort in comparison to current test reuse strategies, for the same fault detection capability. © 2017 IEEE.
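The reuse step can be illustrated with a small sketch (invented names; the paper's actual selection algorithm over FSM-generated suites is more involved): tests already concretized for earlier products are intersected with the suite needed for the new product, and only the remainder requires manual concretization.

```python
from typing import Iterable, Sequence, Set, Tuple

def split_by_reuse(concretized: Iterable[Sequence[str]],
                   new_product_tests: Iterable[Sequence[str]]
                   ) -> Tuple[Set[tuple], Set[tuple]]:
    """Partition the new product's test suite into tests that are already
    concretized for earlier products (reusable as-is) and tests that still
    need the expensive concretization step."""
    pool = {tuple(t) for t in concretized}
    needed = {tuple(t) for t in new_product_tests}
    return needed & pool, needed - pool

reused, to_concretize = split_by_reuse(
    concretized=[("on", "select", "off"), ("on", "off")],
    new_product_tests=[("on", "select", "off"), ("on", "dim", "off")],
)
# reused == {("on", "select", "off")}; only ("on", "dim", "off") costs effort
```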
Simple Network Management Protocol (SNMP) has been the traditional approach for configuring and monitoring network devices, but its limitations in security and automation have driven the exploration of alternative solutions. The Network Configuration Protocol (NETCONF) and the Yet Another Next Generation (YANG) data modeling language significantly improve security and automation capabilities. This thesis aims to investigate the feasibility of implementing a NETCONF server on the Anybus CompactCom (ABCC) Industrial Internet of Things (IIoT) Security module, an embedded device with limited processing power and memory, running a custom operating system, and using open source projects with MbedTLS as the cryptographic primitive library. The project will assess implementing a YANG model to describe the ABCC's configurable interface, connecting with a NETCONF client to exchange capabilities, monitoring specific attributes or interfaces on the device, and invoking remote procedure call (RPC) commands to configure the ABCC settings. The goal is to provide a proof of concept and contribute to the growing trend of adopting NETCONF and YANG in the industry, particularly for the IIoT platform of Hardware Meets Software (HMS).
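On the client side, such a server can be exercised with the Python ncclient library; the sketch below uses placeholder host details and a generic subtree filter, since the ABCC's actual YANG model is specific to this thesis:

```python
from ncclient import manager

# Placeholder address and credentials; the subtree filter is generic because
# the ABCC's actual YANG module is specific to this project.
with manager.connect(host="192.0.2.10", port=830,
                     username="admin", password="secret",
                     hostkey_verify=False) as m:
    # The NETCONF capability exchange happens during session setup.
    for capability in m.server_capabilities:
        print(capability)
    # Monitor specific attributes with a subtree filter on <get>.
    reply = m.get(filter=("subtree", "<interfaces/>"))
    print(reply)
```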
Lansen Technology develops and sells alarm systems. The communication between the system devices is wireless, and the radio protocol used by the system was developed by Lansen with the purpose of being energy efficient. The alarm system's target groups are individuals, businesses and government agencies. The current system is installed, configured and controlled from a control panel for all audiences. Some operations are also available from a mobile phone via a GSM network. Lansen Technology wants to move more of the functionality to a mobile device and avoid the cost of using the GSM network used today. The target group for the application is primarily users who want instant access to information within the network. The project has mainly consisted of two parts. The first part was to investigate two different wireless technologies that were relevant to the project. The investigation showed that Bluetooth was the best choice based on the requirements of the alarm system, and Bluetooth was then integrated into the existing alarm system. The second part consisted of developing software to operate the alarm system from a PDA running Windows Mobile. An application was developed successfully, and it can perform the majority of the functions specified by Lansen. The idea of remotely controlling an alarm system from a mobile device has resulted in a successful project.
Data Visualization (DV) can be seen as an important tool for communication and data analysis. Especially when huge amounts of data are involved, visual representation of data can facilitate observation of trends and patterns as well as understanding. Currently, two-dimensional displays are mainly used for Data Visualization, both in two and three dimensions (2D and 3D). However, two-dimensional displays are limited in terms of 3D visualization because they do not allow for a true sense of depth and do not cover the observer's full Field Of View (FOV). An alternative approach is to use Virtual Reality (VR), which provides an immersive and interactive 3D environment. VR has mainly been used for gaming and simulated training, but other areas are now emerging because VR technologies have become relatively affordable. One possibility is to explore VR for DV, and this was the main goal of this project. To accomplish that, a literature study was performed to identify terminologies and definitions, hardware and software technologies, techniques and examples in the fields of DV and VR. In addition, in order to exemplify DV through VR, a prototype system was implemented using Unity 3D, a leading engine for VR. To visualize the developed VR environment, an HTC Vive Head Mounted Display (HMD) was used. The developed prototype system can display data from a local dataset in a scatter plot with three axes in VR. In the virtual environment created by the system, the user can select which attributes in the dataset are displayed by the 3D scatter plot. Once the data is plotted, the user can use the handheld joystick to move, rotate, tilt and scale the scatter plot. The achieved results indicate immersion and interaction as the main perceived benefits of DV using VR.
Software Product Line Engineering (SPLE) is an approach used in the development of similar products, which aims at systematic reuse of software artifacts. The SPLE process has several activities executed to assure software quality. Quality assurance is of vital importance for achieving and maintaining high quality in various artifacts, such as products and processes. Testing activities are widely used in industry for quality assurance. However, the effort of applying testing is usually high, and increasing testing efficiency is a major concern. A common means of increasing efficiency is automation of test design. Several techniques, processes, and strategies have been developed for SPLE testing, but many problems remain open in this area of research. The challenge in focus is the reduction of the overall test effort required to test SPLE products. Test effort can be reduced by maximizing test reuse using models that take advantage of the similarity between products. The goal of this thesis is to automate the generation of small test suites with high fault detection and low test redundancy between products. To achieve this goal, equivalent tests are identified for a set of products using complete and configurable test suites. Two research directions are explored: one product-centered and the other product line-centered. For test design, test suites with full fault coverage were generated from state machines with and without feature constraints. A prototype tool was implemented for test design automation. In addition, the proposed approach was evaluated using examples, experimental studies, and an industrial case study from the automotive domain. The results of the product-centered approach indicate a reduction of 36% in the number of test cases that need to be concretized. The results of the product line-centered approach indicate a reduction of 50% in the number of test cases generated for groups of product configurations.
Conference proceedings often present successful research and best cases. This paper presents a case that initially did NOT develop as anticipated, together with reflections as to why the outcomes were different than expected. It also suggests important factors to consider before similar activities are undertaken in the future. The case presented investigates reflective assessments for the module "Current Issues in Edutainment Software Design", given to seniors in the Edutainment Software Design program at Halmstad University, Sweden. Throughout their program, these students have been indoctrinated to engage in self-reflection. Module assessments included development of both individual and group papers. Moreover, all students were to reflect on their own learning process and record their thoughts in a diary. Analysis of the texts indicates that self-reflection "on command" proved difficult for these students, although they had been "trained" to reflect. Compared with two other groups lacking a similar reflective background, instructed reflection seemed easier for the "untrained" students. © 2003 IEEE
This report is about implementing a real-time water quality monitoring system that measures water quality autonomously in any water environment. The purpose of collecting the data is to analyze the results and build a clear picture, making it possible to act quickly and find solutions in the event of pollution or other dangerous circumstances. The report explains the submarine system that connects to the winch system, part of the primary system. Two Raspberry Pi microcomputers and sensors are used to collect the data. A serial connection is used for communication between the two units in order to transfer data between them. The focus area of the whole project is the communication part, since the prototype's design is not a part of this thesis. The collected data is stored on the central microcomputer in a CSV file. However, it should ideally be stored on a database server to take advantage of the data in the fastest way possible.
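The described pattern, one unit streaming sensor readings over a serial link to a central unit that appends them to a CSV file, can be sketched with pyserial; the device path, baud rate, and record format below are assumptions, not the project's actual configuration:

```python
import csv
import serial  # pyserial

# Assumed port, baud rate, and "timestamp,pH,temperature" record format.
link = serial.Serial("/dev/ttyS0", baudrate=9600, timeout=5)

with open("water_quality.csv", "a", newline="") as f:
    writer = csv.writer(f)
    while True:
        raw = link.readline().decode("ascii", errors="ignore").strip()
        if raw:                       # e.g. "2024-05-01T12:00:00,7.1,18.4"
            writer.writerow(raw.split(","))
            f.flush()                 # persist each sample immediately
```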
This report describes the development of a time reporting system that applies Near-Field Communication (NFC) technology. The system is primarily intended for a company; it is designed with modern techniques and was developed with a vision of expansion and further development.
A web application has been developed that consists of a server, a database, and a web page. The server can receive requests to manage the information stored in the database. The web page provides a user interface, accessible with a browser, for managing and reviewing the time logs.
The system includes a station that uses an NFC reader to read information from external NFC devices. The information is then forwarded, over Wi-Fi, to the server to either register a new station or create a time log.
There is also a mobile application, developed for mobile phones that run Android and have built-in NFC support. A mobile phone that meets these criteria can be swiped over a station to perform a time log. The mobile application can create, modify, delete, and view time logs. GPS is integrated for navigation and to associate a position with a time log.
During the last century, various countries' armed forces have used unmanned aerial vehicles, commonly known as drones. In recent years, efforts have been made to develop small commercial drones, allowing the general public to afford and use them for recreational purposes. The availability of drones has also led to immoral applications of the technology. Such applications need to be met with counter-measures and effective detection methods. Therefore, this thesis aims to develop a mobile reconnaissance robot that can detect commercial drones with radar. It describes the integration of radar sensors with single-board computers to detect and localise airborne objects. The finished product is intended for educational and exhibition purposes at the Swedish Armed Forces technical school, to increase awareness of the technology.
Today, wireless technologies are increasingly used in the automation systems of homes and buildings. More electrical devices are used in a house to save time, money, and energy, because they are relatively inexpensive and easy to install; these devices even allow connectivity with smart components such as mobile tablets and computers. To connect all these devices for data transmission and easy access, KNX is the best choice. The KNX standard is an open standard for home and building automation, and it supports different communication media, such as twisted pair, power line, radio frequency, and IP tunnelling. A KNX system is a bus system for building control, meaning all electrical and smart devices in a KNX system use the same transmission method and exchange telegrams via a shared bus network. Checking and controlling all the electrical devices in a home or an apartment takes time, which is why there is a great need for applications that make controlling every room easier and much faster. This project is about designing and implementing a visualization application for Windows and .NET for managing and comparing input data with the actual data. The application is equipped with a KNX bus driver to communicate with hardware in a building. The practical part of the application is to take raw data and sort it in a specific way, in order to minimize the time needed to control the KNX devices in a building.
Validation and verification of Software Product Lines is particularly challenging due to the complex structure and interaction of commonalities and variabilities among products. There are several approaches to specifying the structure of such commonalities and variabilities, such as the delta-oriented approach. Building upon such a structure, we propose an approach to avoid redundant analysis in Software Product Lines by extending it to semantic behavioural changes. To this end, we propose to use Differential Symbolic Execution, an automated technique for proving functional behavioural equivalence based on satisfiability modulo theories. Our proposal aims at identifying the behavioural commonalities of one software product relative to another and exploits them in order to establish an efficient model-based testing trajectory.
The web application esMatrix is developed by Entergate AB. The application is used to calculate and create priority matrices. Entergate wants to develop new software with esMatrix as the base, since esMatrix does not perform well enough. The new application should manage data in a more efficient way, and all functions from esMatrix should be optimized. New functions should also be added to the new software. The project is a degree project, and this report explains the different theories and methods that were used to develop the new application. The result of this project is an application that performs better than the previous one when calculating the matrices. Functions have also been added, such as coloring the matrices, exporting documents to PDF files, and downloading exported documents as a ZIP file.
This short paper discusses a handful of perhaps obvious, but important, observations about KeY, the state-of-the-art deductive verification tool for Java programs. Two light research ideas surface during the admittedly divergent discussion, both of which seem to be little explored, at least in the given context. Not all projects survive for as long as KeY has; it takes a good idea and dedicated people for that to happen. Hence, the paper also contributes a formally proved correspondence between using KeY and being a good researcher. Apart from that, considering the occasion to which this paper is dedicated, a handful of memories about Prof. Hähnle are also shared. © 2022, Springer Nature Switzerland AG.
There is a need for cheap treadmills that automatically control their speed according to the user's stepping. The need is primarily in healthcare, for rehabilitation involving patients with different mobility and balance issues, for which current solutions are very expensive. The benefits are the ability to improve treatment and to increase availability and safety without the need for equipment worn by the user. To address this, a system consisting of a treadmill and sensors was created. The system can track the user's position and posture and control the treadmill's speed accordingly. Posture is tracked by an algorithm that identifies risks, such as signs of lost balance or an impending fall, indicating that the treadmill should slow down or stop. A prototype was created during an earlier project, but it had problems and weaknesses. Issues with the laser sensor used for measuring distance were investigated, and the delay in the control of the treadmill's motor was tested. To avoid the delay, attempts were made to identify the communication protocol used by the treadmill in order to achieve better speed control. New algorithms for speed control and risk detection were created, and the original system was modified with new functionality and tools for testing. Results show that the sensor issues were caused by electromagnetic interference from the treadmill disturbing the I2C protocol used with the sensor. A comparison of the laser and the Kinect shows that the laser is not needed and that the Kinect is more stable. The new system's software creates a more modular environment for testing algorithms. Depending on their construction, it can be difficult to achieve smoother speed control of treadmills without rebuilding them. Given an appropriate motor control scheme, the system and algorithms can achieve the intended purpose.
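The speed-control idea can be sketched as a simple proportional rule with a safety override (the gains and limits below are illustrative, not the project's tuned values, and risk detection is abstracted into a boolean):

```python
def belt_speed(current_speed: float, user_offset_m: float,
               risk_detected: bool, gain: float = 0.8,
               max_speed: float = 3.0) -> float:
    """Move the belt so the user stays near a reference point on it: walking
    ahead of the point speeds the belt up, drifting behind slows it down.
    A detected balance risk overrides everything and stops the belt."""
    if risk_detected:
        return 0.0
    return max(0.0, min(max_speed, current_speed + gain * user_offset_m))
```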
A set of new algorithms and software tools for automatic protein identification using peptide mass fingerprinting is presented. The software is automatic, fast and modular to suit different laboratory needs, and it can be operated either via a Java user interface or called from within scripts. The software modules perform peak extraction, peak filtering and protein database matching, and communicate via XML. Individual modules can therefore easily be replaced with other software if desired, and all intermediate results are available to the user. The algorithms are designed to operate without human intervention and contain several novel approaches. The performance and capabilities of the software are illustrated on spectra from different mass spectrometer manufacturers, and the factors influencing successful identification are discussed and quantified.
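The database-matching step can be caricatured as counting observed peak masses that agree with a protein's theoretical peptide masses within a tolerance; the sketch below is a simplification with invented names, not the paper's actual scoring:

```python
from typing import Dict, List

def match_score(peaks: List[float], theoretical: List[float],
                tolerance: float = 0.2) -> int:
    """Count observed peaks matching a theoretical peptide mass within the
    tolerance (in Da)."""
    return sum(any(abs(p - m) <= tolerance for m in theoretical) for p in peaks)

def identify(peaks: List[float], database: Dict[str, List[float]]) -> str:
    """Return the protein whose theoretical digest best explains the peaks."""
    return max(database, key=lambda prot: match_score(peaks, database[prot]))
```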
Wireless communication is important for data transport between devices, and it is as important as ever to find the connection best fitted to the user's specific requirements. This process can be a hassle, and an application testing all the options could save both time and resources. This project involves solving the mentioned problem and creating an application that implements the solution.
One method of recommending a connection is to measure different metrics, e.g. throughput and packet loss, for Bluetooth (BT) and for each channel on the 2.4 GHz and 5 GHz Wireless Local Area Network (WLAN) bands. By using an appropriate algorithm to rate the connections, a result ranked from best to worst can be displayed to the user. The user can also input limits on the metrics to tailor the result to their requirements.
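One plausible form of such a rating algorithm, with weights and limits invented purely for illustration, is a weighted score over the measured metrics, with user-supplied limits acting as hard filters:

```python
from typing import Dict, Optional

def rate(metrics: Dict[str, float],
         limits: Optional[Dict[str, float]] = None) -> Optional[float]:
    """Score one connection option: throughput helps, loss and latency hurt.
    A violated user-supplied limit disqualifies the option entirely."""
    limits = limits or {}
    for name, limit in limits.items():
        if metrics.get(name, 0.0) > limit:
            return None
    return (metrics["throughput_mbps"]
            - 2.0 * metrics["packet_loss_pct"]
            - 0.1 * metrics["latency_ms"])

options = {
    "BT":          {"throughput_mbps": 2,   "packet_loss_pct": 1, "latency_ms": 40},
    "WLAN ch 6":   {"throughput_mbps": 54,  "packet_loss_pct": 8, "latency_ms": 12},
    "WLAN ch 36":  {"throughput_mbps": 150, "packet_loss_pct": 0, "latency_ms": 8},
}
scores = {n: rate(m, limits={"packet_loss_pct": 5}) for n, m in options.items()}
ranking = sorted((n for n, s in scores.items() if s is not None),
                 key=lambda n: scores[n], reverse=True)
# ranking is best to worst; "WLAN ch 6" is disqualified by the loss limit
```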
The application is based on the Hardware Meets Software (HMS) product Anybus Wireless Bridge II. Over a Telnet connection, attention commands (AT commands) can be used to configure and utilize the modules, e.g. to measure signal strength. The third-party tools POCO and iPerf are used to measure other metrics, e.g. throughput and latency. C++ and Visual Studio were used to develop the application.
In accordance with the test specification, the project resulted in a successfully working application. Tests show that interference on a channel results in a worse rating, and that other, unaffected channels get a higher ranking on the list. There is still room for improvement regarding exception handling when connection timeouts occur due to loss of signal.
This paper highlights the collaboration between industry and academia in research. It describes more than two decades of intensive development and research of new hardware and software platforms to support innovative, high-performance sensor systems with extremely high demands on embedded signal processing capability. The joint research can be seen as the run before a necessary jump to a new kind of computational platform based on parallelism. The collaboration has had several phases, starting with a focus on hardware, then on efficiency, later on software development, and finally on taking the jump and understanding the expected future. In the first part of the paper, these phases and their respective challenges and results are described. Then, in the second part, we reflect upon the motivation for collaboration between company and university, the roles of the partners, the experiences gained and the long-term effects on both sides. Copyright © 2014 ACM.
The development, testing and evaluation of novel approaches to Intelligent Environment data processing require access to datasets which are of high quality, validated and annotated. Access to such datasets is limited due to issues including cost, flexibility, practicality, and the lack of a globally standardized data format. These limitations are detrimental to the progress of research. This paper provides an overview of the Open Data Initiative and the use of simulation software (IE Sim) to provide a platform for the objective assessment and comparison of activity recognition solutions. To demonstrate the approach, a dataset was generated and distributed to three international research organizations. Results from this study demonstrate that the approach is capable of providing a platform for benchmarking and comparing novel approaches.
Model-based testing is one of the promising technologies to increase the efficiency and effectiveness of software testing. This paper discusses model-based testing in general, it presents the model-based testing tool TORXAKIS, and it shows how TORXAKIS was applied to test a file synchronization system, Dropbox, revisiting an experiment presented in (Hughes, Pierce, Arts, & Norell, 2016).
Recent advances in hardware, software, and communication technologies are enabling the design and implementation of a whole range of different types of networks that are being deployed in various environments. One such network that has received a lot of interest in the last couple of years is the Vehicular Ad-Hoc Network (VANET). VANET has become an active area of research, standardization, and development because it has tremendous potential to improve vehicle and road safety, traffic efficiency, and convenience as well as comfort to both drivers and passengers. Recent research efforts have placed a strong emphasis on novel VANET design architectures and implementations. A lot of VANET research work has focused on specific areas including routing, broadcasting, Quality of Service (QoS), and security. We survey some of the recent research results in these areas. We present a review of wireless access standards for VANETs, and describe some of the recent VANET trials and deployments in the US, Japan, and the European Union. In addition, we briefly present some of the simulators currently available to VANET researchers for VANET simulations, and we assess their benefits and limitations. Finally, we outline some of the VANET research challenges that still need to be addressed to enable the ubiquitous deployment and widespread adoption of scalable, reliable, robust, and secure VANET architectures, protocols, technologies, and services.