Everything started in 1993, when I began playing games on an Atari 2600. Since then, I have dreamed of creating my own game. To make that dream come true, I set out to learn many different things, including computer programming, modeling, and animation techniques.
After the Atari, I bought a Commodore 64. Using its BASIC language, I wrote code for simple animations, such as a moving 2D elephant and a 2D face with moving lips. A year later, I bought an IBM-compatible computer with an 80286 CPU, which offered far greater potential for programming more advanced and efficient games. That year, I learned QBasic and wrote a darts game in it, which was printed and described in a magazine.
When I was 10, my brothers and I decided to recreate the first level of the Commodore game “The Last Ninja 3” on the PC using QBasic; I was responsible for designing the objects. At 12, I started working with 3D Studio Max and created a gallery on the 3DLinks website to showcase my models and other work. When I was 13, I made a short animation and sent it to a program on IRIB2, a nationwide TV channel, which broadcast my work. When I was 15, my high school held a national mathematics congress for students. Since my teachers were familiar with my work, they asked me to create the congress’s introductory animation. At the same congress, a friend and I submitted a function-plotter application that was awarded first prize. That year, we also entered the application in the National Organization for Development of Exceptional Talents (NODET) festival and won a fellowship. The following year, I wrote a tutorial on poly-modeling the head of a famous character from the animated film The Incredibles in 3D Studio Max and was again awarded a fellowship at the NODET festival.
Before enrolling at Hamedan University of Technology (a state university), I wrote an article on the techniques used to create the animated film Ice Age: The Meltdown, based on information gathered from the internet. The article was published in RayanehKhabar, one of the well-known Iranian monthly magazines.
At university, I studied with seriousness and enthusiasm. My projects consistently ranked among the best in my classes, and I served as a teaching assistant for five different courses. Instructors sometimes asked me to teach parts of their syllabus in their place.
After the third semester, three friends and I formed a Computer Game Research Team under the supervision of our department head. We studied different aspects of games, such as software engineering, game engines, and 3D game programming. In this project, I served as software engineer, programmer, and team leader. We used the OGRE graphics engine and OpenAL. In addition, I wrote a MaxScript tool for 3D Studio Max that let us design our game levels in the 3ds Max environment, define their rules in the script’s GUI, and export the resulting scene to a file. Our game could then regenerate the game environment from the exported file and apply the game’s rules to it.
Later, I worked with the DirectX 9 API to implement my own graphics engine. It could load OBJ files along with their texture maps, handle particle systems, and create a GUI for an application.
In my thesis, I used the player’s non-intentional gesture inputs to measure the player’s affective state and dynamically adapt the game environment’s specifications to it. To measure the affective state, I used the two-dimensional continuous activation/evaluation space. In particular, I analyzed the player’s inputs and manner of playing, namely involuntary movements and key hold times; this information makes affective states observable as important indicators of player experience [2, 3]. For dynamic adaptation of game levels, I created a special game engine that takes the measured involuntary feedback into account and adapts the scene accordingly. I used DirectX 10 to implement the graphics engine and property scripting to define game levels and adaptation rules. A major benefit of my approach is that the game level better fits the user’s gameplay skills and thus increases the fun level. Another benefit is that it is unobtrusive: no additional sensors, such as neuro-biological measurement devices, are required. A third benefit is that using off-the-shelf input devices and involuntary feedback to analyze affective states incurs no additional cost, since no extra hardware or measurement setup is needed.
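The core idea — deriving an arousal estimate from key hold times and using it to adapt a level parameter — can be sketched as follows. This is a minimal illustrative sketch, not the thesis code; the function names, hold-time thresholds, and weights are hypothetical assumptions.

```python
import statistics

def arousal_from_key_events(events):
    """Estimate an arousal score in [0, 1] from key press/release timestamps.

    `events` is a list of (press_time, release_time) tuples in seconds.
    Shorter, more erratic key holds are read as higher arousal; the
    normalization constants below are illustrative, not measured values.
    """
    holds = [release - press for press, release in events]
    mean_hold = statistics.mean(holds)
    jitter = statistics.pstdev(holds)
    # Map mean hold time (fast taps around 0.05 s, relaxed holds around
    # 0.4 s) and jitter onto a 0..1 arousal axis, then clamp.
    score = 0.7 * (0.4 - mean_hold) / 0.35 + 0.3 * min(jitter / 0.1, 1.0)
    return max(0.0, min(1.0, score))

def adapt_gravity(base_gravity, arousal, sensitivity=0.5):
    """Scale a level parameter (here: gravity) by the measured arousal."""
    return base_gravity * (1.0 + sensitivity * (arousal - 0.5))
```

In a game loop, the engine would periodically feed recent input events through such an estimator and re-tune level parameters, so the adaptation stays unobtrusive to the player.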
As a proof of concept, I prototyped a 3D game on my own engine, similar to the famous “Sabotage” game. The player must shoot falling bombs before they land on his base, aiming at the nearest bomb with minimal hand shake and as quickly as possible. My game uses these parameters to determine the gravity of the game environment. The scientific contribution of my thesis was the design, implementation, and evaluation of this game; compared to [4, 5, 6], it requires no additional input device and can target almost every computer or gaming device.
I have also implemented a face-generator application using DirectX 9 and MFC (Microsoft Foundation Classes). It takes a front-view picture of a face, locates the facial features in the picture, and deforms a reference 3D face accordingly to construct a face resembling the person in the picture. The application can perform texturing using both RBF texture mapping and linear texture mapping.
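RBF texture mapping of this kind can be viewed as scattered-data interpolation over feature correspondences: known feature positions in the photo are matched to positions in the reference face’s texture layout, and an RBF interpolant maps every other point. The following sketch uses the thin-plate-spline basis; the function name and setup are my own illustrative choices, not the application’s actual code.

```python
import numpy as np

def rbf_warp(src_pts, dst_pts, query_pts, eps=1e-9):
    """Map 2D query points with an RBF interpolant fitted on control points.

    src_pts, dst_pts: (N, 2) corresponding feature positions (e.g., eye
    and mouth corners in the photo vs. the reference texture layout).
    Uses the thin-plate-spline basis phi(r) = r^2 log r plus an affine
    term, so pure translations and scalings are reproduced exactly.
    """
    src = np.asarray(src_pts, float)
    dst = np.asarray(dst_pts, float)
    q = np.asarray(query_pts, float)

    def phi(r):
        return np.where(r > eps, r * r * np.log(r + eps), 0.0)

    # Standard TPS linear system: [K P; P^T 0] [w; a] = [dst; 0],
    # where K holds pairwise basis values and P = [1, x, y].
    n = len(src)
    d = np.linalg.norm(src[:, None] - src[None, :], axis=-1)
    A = np.zeros((n + 3, n + 3))
    A[:n, :n] = phi(d)
    A[:n, n] = 1.0
    A[:n, n + 1:] = src
    A[n, :n] = 1.0
    A[n + 1:, :n] = src.T
    b = np.zeros((n + 3, 2))
    b[:n] = dst
    w = np.linalg.solve(A, b)

    # Evaluate: RBF part + constant + affine part.
    dq = np.linalg.norm(q[:, None] - src[None, :], axis=-1)
    return phi(dq) @ w[:n] + w[n] + q @ w[n + 1:]
```

Linear texture mapping corresponds to keeping only the affine part of this model; the RBF term is what lets the mapping bend smoothly between feature points.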
[1] Mostajabodaveh, M. Incorporating Affective States of Players in Video Games. B.Sc. thesis, Computer Engineering Department, Hamedan University of Technology (HUT), Iran.
[2] Maehr, W. 2008. eMotion: Estimation of the User’s Emotional State by Mouse Motions. VDM Verlag Dr. Müller.
[3] Sykes, J., and Brown, S. 2003. Affective Gaming: Measuring Emotion Through the Gamepad. CHI 2003, Ft. Lauderdale, Florida, USA, 732–733.
[4] Bersak, D., McDarby, G., Augenblick, N., McDarby, P., McDonnell, D., McDonnell, B., and Karkun, R. 2001. Intelligent Biofeedback Using an Immersive Competitive Environment. Proceedings of the UbiComp 2001 Workshop on Ubiquitous Computing.
[5] Reynolds, C., Picard, R., and Benton, S. 2001. The Sensing and Measurement of Frustration with Computers. M.Sc. thesis, Media Arts and Sciences, MIT.
[6] Kuikkaniemi, K., Turpeinen, M., Saari, T., Kosunen, I., and Ravaja, N. 2010. The Influence of Implicit and Explicit Biofeedback in First-Person Shooter Games. Proceedings of ACM CHI 2010.