A virtual/augmented reality first-person shooter computer game, created from imagery captured through aerial photogrammetry using a self-built autonomous quadcopter.
The purpose of this project was not just to produce a detailed demonstration of an obscure programming construct in the form of a program nobody would ever use, but to create an amazing and captivating experience. Instead of building a virtual environment from scratch, a real-world environment would be captured and then augmented with 3D graphics editing and the addition of game mechanics, creating a playable environment that was recognisable.

At the time, photogrammetry was in its infancy and only just being recognised for its huge potential in the games industry; it has since been used, for example, in the creation of rocks in PUBG. The process involves capturing pictures of an object or environment from all angles and then processing the imagery through software such as Agisoft to produce a rendered 3D model. There would be a large demand on graphics capability, so I looked into using a trial of AWS and hiring rigs, and eventually reached out to Barclays Lab, who offered the use of one of their VR-capable computers.

While I could have captured the aerial footage using a DJI Phantom, I ventured into a whole new project to create an autonomous quadcopter, which, having since grown into one of my largest projects, has meant I have not had the time to approach the remainder of this project. The main development environment of choice would be Unreal Engine, because of the company's early adoption of and support for photogrammetry, which is now in use in many games.

As the name suggests, this project was originally intended as my Class 12 project, which I continued to develop in my own time after leaving for a state school, and later adapted to achieve an A for an EPQ (Extended Project Qualification).
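Aerial capture for photogrammetry is usually planned around image overlap: consecutive photos must share enough ground for the software to match features between them. As a minimal illustrative sketch (the altitude, field of view and overlap figures below are assumptions for the example, not the actual rig's parameters), the ground footprint of each photo and the spacing between shots can be estimated with basic trigonometry:

```python
import math

def ground_footprint(altitude_m, fov_deg):
    """Width of ground covered by one photo for a given horizontal field of view."""
    return 2 * altitude_m * math.tan(math.radians(fov_deg) / 2)

def shot_spacing(altitude_m, fov_deg, overlap):
    """Distance to travel between shots for a given forward overlap fraction (0-1)."""
    return ground_footprint(altitude_m, fov_deg) * (1 - overlap)

# Assumed example values: 50 m altitude, 70-degree FOV, 80% forward overlap.
footprint = ground_footprint(50, 70)   # roughly 70 m of ground per frame
spacing = shot_spacing(50, 70, 0.8)    # trigger a photo roughly every 14 m
```

The same spacing figure then feeds directly into how far apart the waypoints of an automated flight plan need to be.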
Project 12 is the body of work covering a range of technologies, with the aim of creating a first-person shooter. As it turns out, drones are used for many applications, such as monitoring land for agriculture, surveying mining operations and producing maps.
H2BAQ - How To Build A Quadcopter
However, one of the main problems in accomplishing Project 12 was the difficulty of capturing footage from such heights in an automated fashion. To solve this, a subsidiary project was born: a quadcopter with full autonomous capability that could follow a grid pattern in the sky, programmed via a ground control station running on a computer. After finding so little documentation for the open-source quadcopter I was constructing, I decided to embark on a subproject to fully document the autonomous quadcopter's construction in the form of a YouTube channel called H2BAQ, which simply stands for How To Build A Quadcopter. An example of how the workload quickly expanded is that instead of simply downloading an intro video template, I took the time to realise my vision of what the animation should look like.
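The grid pattern mentioned above is typically a "lawnmower" survey: parallel passes across the area, with the direction reversed on alternate rows so the aircraft never doubles back over ground it has already covered. A minimal sketch of how a ground control station might generate such waypoints (the field dimensions and row spacing are illustrative assumptions, and real planners work in GPS coordinates rather than local metres):

```python
def grid_waypoints(width_m, height_m, spacing_m):
    """Generate (x, y) waypoints in a lawnmower pattern over a rectangle.

    Each pass sweeps the full width; every other pass is reversed so the
    end of one row lines up with the start of the next.
    """
    waypoints = []
    y = 0.0
    row = 0
    while y <= height_m:
        ends = [(0.0, y), (width_m, y)]
        if row % 2 == 1:       # reverse direction on alternate rows
            ends.reverse()
        waypoints.extend(ends)
        y += spacing_m
        row += 1
    return waypoints

# Example: a 100 m x 40 m field with passes 20 m apart gives three
# passes of two waypoints each (out, back, out).
path = grid_waypoints(100, 40, 20)
```

The spacing between rows would come from the photo-overlap calculation, so each pass's imagery overlaps the neighbouring pass enough for photogrammetry to stitch them.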
I had to learn not only more about computer animation but also the idiosyncrasies of Blender, which turns out to be one of the most convoluted and nonsensical programs even for open-source software. For only 10 seconds of animation, I spent weeks learning around the subject. The YouTube channel also needed unique artwork, a logo, a theme, general marketing, a social media presence and a uniquely tailored website.
Video Editing & Production
Some £200 was spent to acquire Apple's Final Cut Pro X video editing and compositing software, which was used in combination with its siblings Compressor and Motion to produce all the videos.
Other video editing software was considered, such as offerings from Corel, Avid and Adobe CC. However, FCPX made for a seamless transition in user experience from iMovie, which I had already used extensively and knew every part of.
The most valuable experience gained in developing the videos was seeing the effect that a streamlined and efficient workflow could have on development times. This ranged from small details, such as the keyboard shortcuts used, to the grand scheme of which work should be done first.
A well-used planning tool was storyboarding, which allowed for rapid and unbounded prototyping of the shot order and of what material would need to be completed to illustrate each video's information. A custom template was made and printed out, with many variations for each video, carrying captions, annotations and crucial directions for what content would be shot and how. This was vital, as many mistakes were made early on, with missing content meaning videos needed to be reshot after editing. Having the storyboard as a reference allowed tasks to be crossed off when completed, making sure nothing was missed. It also provided a list of all secondary-sourced B-roll that would need to be indexed for later import into the post-shooting part of the workflow.
A large folder hierarchy was continuously mutated to fit the needs of the workflow and keep all relevant material and assets to hand. This was very important when placing footage through the videos in an ordered manner, where sorting through unordered data had previously wasted much time.
Even when the content used was secondary, it was re-edited and reimagined in a new context. A piece of downloading software was installed to make sure no quality was lost; however, this was not used in the 'RC Inspire' video, where the lower quality can be seen. Aspects such as this became obvious over time and were learnt through the workflow.
While a longer, more comprehensive video could have been made to encompass the information divided over several, there was the viewer's attention span and interest to consider. Another advantage of separating different topics is that each video would appear individually in searches.
Intro Clip Animation
While widely credited as one of the most convoluted, user-distressing, nonsensical and frustrating pieces of software available to date, Blender is also known for its huge range of capabilities and myriad features for producing professional, industry-recognised results. While this criticism may seem harsh for free software, it must be recognised that Blender is developed under the GPL and thus freely by the community. Learning enough to create the initial 10-second introduction animation took roughly a month and was demoralising, taking every ounce of committed persistence and patience. Eventually, after many iterations and failed exports, each frame of the animation was rendered using raytracing, taking roughly forty-eight hours on a 2014 MacBook Air (i5, 4 GB, Intel Iris 5000).
Keyframes were added in the timeline for each object to generate its path within the 3D environment. The project's physics were based on an intro file downloaded from a YouTube channel.
Each aspect of the brand experience was considered so that everything would tie together and be quickly identifiable as part of the H2BAQ project. Bold colours and fonts were used in combination with clip art and pictures to clearly state the content of each video, even at small scale. An intriguing background was used throughout the main series so that the videos could be recognised as such and chosen next by the viewer.
From comprehensive research and through a process of trial and error, scripts were written to give a full and comprehensive guide to building the quadcopter. The documents are for use within the project only and are thus unformatted, written for best results with the default computerised voice generator installed on Macs for Australian users. They were written for easy understanding by the general public and to maintain the viewers' interest throughout the videos. Extra lines and punctuation were used to ensure there was enough space between words when post-editing the waveform, so that the dialogue could be separated out and paced according to the footage.