The project period will consist of several phases. Each phase will build on the outcome of the previous phase and follow a strict plan.

To ensure progression, a detailed plan will be made. This section describes each period and the role it plays in the project. Each project period will be organised as a sprint.

Sprint 1 - Background research and information gathering

Once the problem statement is defined, a plan for information gathering must be made. The problem statement must be put on the agenda and evaluated against similar approaches and projects. Relevant literature must be found and used to decide what is relevant for the project plan.

Sprint 2 - Project planning

The project implementation period must be examined, and an analysis must be made outlining which factors to consider when planning and designing the project. Stakeholders must be included in this phase to make sure that the project plan matches the end-user needs. A risk analysis must be made outlining policies and concerns that may change the project outcome if something unexpected happens.

Sprint 3 - Design period

A project design must be made containing models for design and workflows. Use-cases must be defined and decisions on technologies and development-methodologies (SCRUM, Kanban) must be made. A proper development-plan should be made, giving stakeholders the opportunity to monitor the project implementation during the development phase. The design-plan should outline the following:

• Documentation specifications

• Revision control

• Inventory specifications

• Software development methodology specifications to keep track of the development process.

• Technical specifications

Sprint 4 - Development period

The development period will focus on implementing a prototype according to the plan decided in the design period. Revision control and issue tracking will keep track of development progression, and should help the teacher supervise the developer and keep the implementation of a properly working solution on track according to plan. The development-period will consist of several small sprints with several iterations, where each iteration focuses on solving a specific task.

The use of revision control will make it possible to keep track of the change-history during development. If something needs to be rolled back to a previous version, revision control makes this possible by reverting the content to an earlier version.
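As a minimal sketch (not a decision made by the plan itself), assuming Git is used as the revision-control tool, undoing an unwanted change could be automated from a small helper script; the commit hash is a placeholder:

    import subprocess

    # Hypothetical example: undo the effect of an earlier commit without
    # rewriting history, assuming Git is the revision-control tool in use.
    BAD_COMMIT = "abc1234"  # placeholder commit hash

    # 'git revert' creates a new commit that reverses the given commit,
    # restoring the affected files to their earlier content.
    subprocess.run(["git", "revert", "--no-edit", BAD_COMMIT], check=True)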

Sprint 5 - Deployment period

The deployment period will focus on building the production infrastructure and deploying the code developed during the development phase. If a new change has been made to the code, the module must be redeployed.

The deployment period will also focus on keeping the system running and on improvements that can be made to the application and the infrastructure.

Sprint 6 - Analysis period

The analysis period will be at the end of the project period. This sprint will focus on analysing the data gathered during the project. The analysis will help conclude the project and will bring the problem statement into focus.

Choosing the data to analyse

The goal of the analysis-period is to use the gathered data to help answer the problem statement presented in the introduction-chapter. The adjectives used in the problem statement were described in detail at the beginning of the Approach-chapter. Scaling and value-driven will be key terms when the analysis is conducted. Before the analysis can start, a plan must be made for how the data collection should be conducted. The prototype aims to be used in a course at NTNU Gjøvik starting January 2016. The research will consist of real student-data that can be used directly in different statistical analyses. The different analyses will be outlined in the next subsections.

Uptime Challenge

The students will participate in a challenge called the uptime challenge.

The uptime challenge is a competition where each student-group administers and operates a web site and governs the infrastructure running the site. A set of tests will execute against the student-sites, and the students will receive reward/punishment costs based on the test-results over time. During the uptime challenge, tests will execute continuously against the student-sites and produce new reports and rewards over time.
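A minimal sketch of how such a continuous test-and-reward loop could be modelled is shown below; the site URL, test interval and reward/punishment values are illustrative assumptions and not values decided in this plan:

    import time
    import requests

    SITE_URL = "http://example-group-site.local/"   # hypothetical student-site
    REWARD, PUNISHMENT = 1.0, -2.0                  # illustrative cost values
    INTERVAL_SECONDS = 60                           # illustrative test interval

    def run_test(url: str) -> float:
        """Probe the student-site once and return the resulting reward or punishment."""
        try:
            response = requests.get(url, timeout=5)
            return REWARD if response.ok else PUNISHMENT
        except requests.RequestException:
            return PUNISHMENT

    if __name__ == "__main__":
        balance = 0.0
        while True:
            balance += run_test(SITE_URL)
            print(f"current balance: {balance}")  # one report entry over time
            time.sleep(INTERVAL_SECONDS)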

Evaluating student effort

An analysis evaluating the student effort must be conducted. The analysis should investigate whether students pay attention to their infrastructures, and look at their progression. The analysis should be conducted using the gathered reports for the different groups. The analysis should be comparative, and should look at commonalities between reports from different tests for the same group. The commonalities can be used to see the bigger picture of the students' effort by combining test-results from the same time-periods that focus on different aspects.
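As an illustration of what such a comparative analysis could look like, assuming the gathered reports can be loaded as a table (the file name and the columns group, test_name, timestamp and score are hypothetical):

    import pandas as pd

    # Hypothetical report table: one row per test-result, with the columns
    # group, test_name, timestamp and score assumed for illustration.
    reports = pd.read_csv("reports.csv", parse_dates=["timestamp"])

    # Align different tests for the same group on a common hourly time grid,
    # so results from the same time-period can be compared side by side.
    pivot = (reports
             .assign(period=reports["timestamp"].dt.floor("h"))
             .pivot_table(index=["group", "period"],
                          columns="test_name",
                          values="score",
                          aggfunc="mean"))

    # Correlations between tests hint at commonalities in each group's effort.
    print(pivot.groupby(level="group").corr())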

Uptime evaluation

Several analyses must be conducted focusing on the uptime for each group.

By looking at the uptime-percentage for each group, it becomes easier to evaluate student-progression. The uptime-percentage tells the tester the proportion of time the student-site was up and running. A site is considered up if it responds to a request. The uptime-percentage can also contribute to an evaluation of the student-effort in the course.
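A minimal sketch of how the uptime-percentage could be computed from a series of request probes (the probe data below is illustrative):

    # Each probe records whether the site responded to a request at that point
    # in time; True means the site was up. The list below is illustrative data.
    probes = [True, True, False, True, True, True, False, True]

    def uptime_percentage(results):
        """Share of probes where the site responded, expressed as a percentage."""
        if not results:
            return 0.0
        return 100.0 * sum(results) / len(results)

    print(f"uptime: {uptime_percentage(probes):.1f}%")  # e.g. 75.0%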

The uptime-percentage could give the students a better overview of the stability of their site. This can be compared to a company hosting a web site: if the site stops working, the company loses money. One of the goals of the course is to teach students about the operational perspectives of running an infrastructure. Two of the main principles when hosting a web site are to keep the site up and running at all times, while consuming the minimum required amount of resources.

Scaling analysis

The project aims to investigate different technologies used to automate scaling in an infrastructure. An analysis must be made outlining one specific scaling-methodology. This analysis should present a set of graphs and numbers indicating how the scaling can be achieved. The aim of the analysis is to evaluate how the number of tests deployed per user can be adjusted in relation to the number of running workers.
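As an illustrative sketch of the relation to be analysed, assuming each worker can handle a fixed number of tests per interval (the capacity figure is a placeholder, not a measured value):

    import math

    TESTS_PER_WORKER = 50  # assumed worker capacity per interval (placeholder)

    def workers_needed(users: int, tests_per_user: int) -> int:
        """Number of running workers required to keep up with the test load."""
        total_tests = users * tests_per_user
        return max(1, math.ceil(total_tests / TESTS_PER_WORKER))

    # Example: with 30 users, raising the number of tests deployed per user
    # shows how the number of running workers would have to scale.
    for tests_per_user in (5, 10, 20):
        print(tests_per_user, "tests/user ->",
              workers_needed(30, tests_per_user), "workers")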

Student survey

At the end of the project, a survey will be conducted among the students. The survey should evaluate the prototype used in the course, and whether the use of the prototype in the uptime challenge motivates the students to work harder. The statistics from the survey will be presented in the Analysis chapter and discussed in the Discussion chapter.