
  ShopBot Level Design

Game Video

Software Used

 

-Unreal Engine 4

-ACID Music Studio 2010

-Photoshop

-Maya 2015

 

 

Team Size

 

-4 people

 

Duration

 

-Less than 3 months

 

Roles

 

-Level Designer

-Game Designer

-Project Manager

-Audio Designer

-Scripting (secondary role)

-3D modelling (secondary role)

 

Project Category

 

This was a university project.

Below is a list of the other team members who worked on this project. It wouldn't have been completed without them, so take a look at their portfolios if you're interested in this piece of work:

 

Luke Norman: http://lwnorman.wix.com/portfolio

Rob Green: robgotbanned.wix.com/portfolio

Michael Hindley: michaelhindley.prosite.com

 

My primary roles in the team were project management, level design, audio design and game design, along with a small amount of scripting and assisting with some of the 3D modelling of props.

 


For this assignment we had to create a level with two states: a present state and a future state. We also had to create gameplay for the level, as well as a cutscene that narratively explains the reason for the level's second state.

 

This led to us making the game ‘ShopBot’, a non-violent game about a robot advertising to potential customers to try and get them into a store. The player earns points for each person they get into the store.

 

We decided to make a non-violent game because we felt it would stand out among the other levels being created. Moreover, it was an excellent design challenge to make a game with no violence in it at all.

 

This thinking led to us creating our core mechanics:

-The upgrade system

-Charge level, which affects how long the player can move around before needing to recharge

-Whales, big spenders who give the player a bonus score

-Business hours, which determine how long the player has until the shop closes

-Advertisement dispersal, which the player uses to attract the AI into the store

 

We also decided that we would use an Xbox 360 gamepad, with the A, X, Y and B buttons triggering the advertisement dispersal. This is what led to there being 4 different colour AIs: each AI is a different colour because it will only be attracted to the player when the matching colour button is used.
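
As a rough illustration of this mapping in UE4 C++ (the project wired this up in Blueprints; the class, action names and handler functions here are hypothetical, and the button-to-colour pairing follows the Xbox 360 pad's own face-button colours):

#include "GameFramework/Character.h"
#include "Components/InputComponent.h"
#include "ShopBotCharacter.generated.h"

UCLASS()
class AShopBotCharacter : public ACharacter
{
    GENERATED_BODY()

public:
    virtual void SetupPlayerInputComponent(UInputComponent* PlayerInputComponent) override
    {
        Super::SetupPlayerInputComponent(PlayerInputComponent);

        // "DisperseGreen" etc. are hypothetical action mappings that would be
        // bound to the gamepad face buttons in the project settings.
        PlayerInputComponent->BindAction("DisperseGreen",  IE_Pressed, this, &AShopBotCharacter::DisperseGreen);  // A
        PlayerInputComponent->BindAction("DisperseBlue",   IE_Pressed, this, &AShopBotCharacter::DisperseBlue);   // X
        PlayerInputComponent->BindAction("DisperseYellow", IE_Pressed, this, &AShopBotCharacter::DisperseYellow); // Y
        PlayerInputComponent->BindAction("DisperseRed",    IE_Pressed, this, &AShopBotCharacter::DisperseRed);    // B
    }

    // One handler per colour; a possible body for DisperseGreen is sketched
    // further down this page alongside the trace-channel explanation.
    void DisperseGreen();
    void DisperseBlue();
    void DisperseYellow();
    void DisperseRed();
};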

 

This was the first project at university where we were given the opportunity to work with people from different disciplines. The team consisted of 2 artists and 2 designers. I learnt a lot of new techniques from the artists, and I also learnt how they work and the process they go through to get artwork completed for a game. This was one of the most valuable things I learnt from this project.

 

The scripting for this game was completed using Blueprints in Unreal Engine 4.

 


We began by taking measurements of the area and putting these measurements onto a level plan. This was crucial for the artists: it meant they could check the real-life measurements of the area they were working on at a glance. From a design perspective it was important because it told us the size of the areas we were putting gameplay elements into.

 


This led to us putting together a second plan mapping out all of the gameplay elements. On this plan you can see the AI spawn points, the player spawn point, the player travel radius and the paths the AI take through the area.

 


Setting up the character AI and their spawn points wasn't too difficult. However, we only wanted a certain number of AI to spawn, and we wanted this to happen in waves. This led to us creating two variables for each AI type: max spawn, which sets how many can be spawned, and spawn index, which counts how many have been spawned in the level. Max spawn and spawn index are compared, and if max spawn is greater than spawn index (which starts at 0), a branch returns true. The true path runs into a set node that adds 1 to the spawn index, the AI is spawned, and this is looped for all of the other spawn points. If the branch is false, the spawn index is set equal to max spawn and runs into a delay; once the delay finishes, the spawn index is returned to 0 and the AI can start spawning again.
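
As a minimal sketch of the same wave logic in UE4 C++ (the project implemented this in Blueprints; the class name, default counts and the delay length are illustrative assumptions):

#include "GameFramework/Actor.h"
#include "GameFramework/Pawn.h"
#include "Engine/World.h"
#include "TimerManager.h"
#include "ShopperSpawner.generated.h"

UCLASS()
class AShopperSpawner : public AActor
{
    GENERATED_BODY()

public:
    // Maximum number of AI this spawner may put out per wave (the "max spawn"
    // variable described above); the default of 5 is an arbitrary example.
    UPROPERTY(EditAnywhere)
    int32 MaxSpawn = 5;

    // Running count of AI spawned this wave (the "spawn index"); starts at 0.
    int32 SpawnIndex = 0;

    // Which AI character to spawn (one spawner per colour, say).
    UPROPERTY(EditAnywhere)
    TSubclassOf<APawn> ShopperClass;

    FTimerHandle WaveTimer;

    // Called whenever the level asks this point to spawn; in the Blueprint
    // version this logic is looped over all of the other spawn points too.
    void TrySpawn()
    {
        if (MaxSpawn > SpawnIndex) // the branch described above
        {
            ++SpawnIndex;
            GetWorld()->SpawnActor<APawn>(ShopperClass, GetActorLocation(), GetActorRotation());
        }
        else
        {
            // Branch false: hold the counter, wait out the delay, then reset
            // to 0 so the next wave can begin. The 10s delay is an assumption.
            SpawnIndex = MaxSpawn;
            GetWorldTimerManager().SetTimer(WaveTimer, [this]() { SpawnIndex = 0; }, 10.0f, false);
        }
    }
};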

 


We tried several methods of AI movement before we settled on what you can see in the final game. We initially tried making the AI move between random points within a radius, but this was very unpredictable and sometimes led to unwanted behaviour. This led to us setting up a behaviour tree, which moves the AI randomly between multiple points in the level and makes them wait at these points for a short amount of time. Even though we managed to set this up, we still struggled to get more than one AI to follow the player at a time. This was an unforeseen problem that appeared during the very late stages of development, so we struggled to get it fixed. Even with this problem, we feel that we created some very interesting AI movement and found an interesting way of setting it up.

 


To get the behaviour tree working we had to set up a basic task and a blackboard. The blackboard stores the variables that are used in the basic task. The basic task takes the variables that have been created in the blackboard and assigns them values: for example, it sets the goal for the AI to be an actor in the scene. These actors are nodes that the AI move between randomly, and the basic task sets the goal to one of them whenever an AI moves. Once this was done we had to implement the task in the behaviour tree.
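
A minimal sketch of such a task in UE4 C++ (the project built this as a Blueprint task; the "WanderPoint" actor tag and the "Goal" blackboard key name are assumptions):

#include "BehaviorTree/BTTaskNode.h"
#include "BehaviorTree/BlackboardComponent.h"
#include "Kismet/GameplayStatics.h"
#include "BTTask_PickWanderGoal.generated.h"

UCLASS()
class UBTTask_PickWanderGoal : public UBTTaskNode
{
    GENERATED_BODY()

public:
    virtual EBTNodeResult::Type ExecuteTask(UBehaviorTreeComponent& OwnerComp, uint8* NodeMemory) override
    {
        // Gather the point actors the AI wander between (tagged in the level).
        TArray<AActor*> Points;
        UGameplayStatics::GetAllActorsWithTag(OwnerComp.GetWorld(), FName("WanderPoint"), Points);
        if (Points.Num() == 0)
        {
            return EBTNodeResult::Failed;
        }

        // Pick one at random and write it to the blackboard's goal key; a
        // Move To node in the behaviour tree then reads this key, and a wait
        // node makes the AI pause at the point for a short time.
        AActor* Goal = Points[FMath::RandRange(0, Points.Num() - 1)];
        OwnerComp.GetBlackboardComponent()->SetValueAsObject(FName("Goal"), Goal);
        return EBTNodeResult::Succeeded;
    }
};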

 

To get the AI to follow the player we set up a ray cast that hits the AI. If an AI detects this ray cast, its movement destination is changed to the player's location.

 

So that the player can hit the AI with advertisements, we used a sphere cast to send out the advertisement dispersal. This method worked very well: with the sphere cast we were able to send out the dispersal and move the target location of any AI it hit to the player.

We set up this ray cast in a Blueprint. We also had to set up trace channels so that a specific ray cast will only affect AI with that trace channel on them; for example, the green trace will only affect the green AI because it has the green trace channel attached to it.
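
As a hedged sketch of what one dispersal handler could look like in UE4 C++ (continuing the hypothetical AShopBotCharacter above; the trace distance, sphere radius and the use of ECC_GameTraceChannel1 as the green channel are all assumptions):

#include "AIController.h"
#include "Engine/World.h"

void AShopBotCharacter::DisperseGreen()
{
    // Sweep a sphere out in front of the robot on the green trace channel;
    // only AI whose collision responds to that channel can be hit.
    const FVector Start = GetActorLocation();
    const FVector End = Start + GetActorForwardVector() * 1000.0f;

    FHitResult Hit;
    if (GetWorld()->SweepSingleByChannel(Hit, Start, End, FQuat::Identity,
                                         ECC_GameTraceChannel1,
                                         FCollisionShape::MakeSphere(200.0f)))
    {
        if (APawn* Shopper = Cast<APawn>(Hit.GetActor()))
        {
            // Retarget the shopper's AI controller at the player, as described above.
            if (AAIController* AI = Cast<AAIController>(Shopper->GetController()))
            {
                AI->MoveToActor(this);
            }
        }
    }
}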

 


We have several animation sets in our game, with animations for the ShopBot and for all of the AI characters. These include idle, forward, backward and other basic movement animations.

 


The animations for the player character are driven by a speed variable that is set from the player's velocity. As the velocity increases, the animations change from idle to forward. We also set this variable up to detect when the player is moving backward so that the reverse animation plays as well, and on top of this we added turn left and right animations to the player character. The changes between animations are controlled by a state machine with transition rules that tell the engine when to switch between them.
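
A minimal sketch of how that speed variable could be computed (the project did this in an Animation Blueprint; the class name and the dot-product approach to detecting backward movement are illustrative):

#include "Animation/AnimInstance.h"
#include "GameFramework/Pawn.h"
#include "ShopBotAnimInstance.generated.h"

UCLASS()
class UShopBotAnimInstance : public UAnimInstance
{
    GENERATED_BODY()

public:
    // Read by the blend space and the state machine's transition rules:
    // 0 = idle, positive = forward, negative = backward.
    UPROPERTY(BlueprintReadOnly)
    float Speed = 0.0f;

    virtual void NativeUpdateAnimation(float DeltaSeconds) override
    {
        Super::NativeUpdateAnimation(DeltaSeconds);

        if (APawn* Owner = TryGetPawnOwner())
        {
            // Project velocity onto the facing direction so that reversing
            // produces a negative speed and plays the backward animation.
            Speed = FVector::DotProduct(Owner->GetVelocity(), Owner->GetActorForwardVector());
        }
    }
};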

 


The AI animations are much simpler: they just read the AI's velocity and blend between idle, walking and running.

  


We wanted the audio for ‘ShopBot’ to be fun and energetic. This comes across in the main gameplay music, which is intended to make the player feel happy, energetic and in a rush, racing against the clock just as they are in game. I mixed the music together to try and create a funky theme that fitted the gameplay of ‘ShopBot’.

 

All of the sound effects were made by myself using ACID Music Studio and a selection of sound clips that I own. We wanted the sound effects to give the same feel as the music and to have a robotic quality to them.

 


These sound effects were then imported into Unreal Engine 4, where the WAVs were turned into sound cues, which could then be edited further in engine. For example, the sound heard when the robot uses its advertisement dispersal has been edited in engine using a modulator. This modulator changes the pitch of the sound clip every time it is played, letting us make one clip sound different each time so that the player doesn't get bored of hearing the same sound effect.
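
A small sketch of the same idea in UE4 C++ (the project used a Modulator node inside a sound cue; the function, pitch range and parameters here are assumptions):

#include "Kismet/GameplayStatics.h"
#include "Sound/SoundBase.h"

void PlayDispersalSound(UObject* WorldContext, USoundBase* DispersalSound, const FVector& Location)
{
    // A fresh pitch multiplier per play keeps the repeated clip from
    // sounding identical every time, much like the modulator described above.
    const float Pitch = FMath::RandRange(0.9f, 1.1f);
    UGameplayStatics::PlaySoundAtLocation(WorldContext, DispersalSound, Location,
                                          /*VolumeMultiplier=*/1.0f, Pitch);
}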

 

Overall, during this project I feel that my skills as a teammate, project manager, designer, scripter and artist all improved. I also learnt invaluable knowledge about Unreal Engine 4 and how to use Blueprints.
