=Paper= {{Paper |id=Vol-2282/EXAG_105 |storemode=property |title=Deceptive Level Generator |pdfUrl=https://ceur-ws.org/Vol-2282/EXAG_105.pdf |volume=Vol-2282 |authors=Adeel Zafar,Hasan Mujtaba,Mirza Omer Beg,Sajid Ali |dblpUrl=https://dblp.org/rec/conf/aiide/ZafarMBA18 }} ==Deceptive Level Generator== https://ceur-ws.org/Vol-2282/EXAG_105.pdf
                                               Deceptive Level Generator

                             Adeel Zafar, Hasan Mujtaba, Mirza Omer Beg, Sajid Ali
                                   Riphah Int. University and NUCES-FAST, Islamabad
        Email: adeel.zafar@riphah.edu.pk, hasan.mujtaba@nu.edu.pk, omer.beg@nu.edu.pk, sajid.ali123@yahoo.com




                           Abstract

  Deceptive games are games where rewards are designed to
  lead agents away from a global optimum policy. In this paper,
  we have developed a deceptive generator that generates three
  different types of traps: the greedy, smoothness, and generality
  traps. The generator was successful in generating levels for a
  large set of games in the General Video Game Level Generation
  framework. Our experimental results show that all tested agents
  were vulnerable to several kinds of deception.

                        Introduction
Creating game content manually is a time-consuming (Togelius et al. 2010) and expensive task. Delegating content generation to an algorithmic process can save time and money. Procedural Content Generation (PCG) (Shaker et al. 2016) is such a method, in which an algorithmic process is used to create a large variety of content, including levels, maps, textures, and weapons. Recent advances in PCG have produced two families of algorithms for automatic generation: constructive techniques and search-based techniques. With a constructive technique, content is generated in a single pass without further iteration; constructive techniques (Shaker et al. 2016) are a simple and fast means of content generation. By contrast, search-based techniques (Togelius et al. 2010) repeatedly regenerate content in order to improve its quality, mostly using an evolutionary algorithm or a similar method.
   In a recent study, Anderson et al. (2018) introduced the concept of deceptive games: games designed to trap the agents or controllers playing them. The motivation for designing such games is a broader interest in designing difficulty, or challenge, in games for Artificial Intelligence (AI) agents. That study provided deceptive games written in the Video Game Description Language (VGDL) (Schaul 2013) within the General Video Game AI (GVG-AI) (Perez et al. 2016) framework. In this paper, we build on this idea and create a deceptive generator that produces deceptive levels for a large set of games in the GVG-AI framework. The generator produces three different types of traps: the greedy trap, the smoothness trap, and the generality trap. Our experimental results highlight that the generated levels are challenging for different AI agents.
   The remainder of the paper is organized as follows. In section 2, we give some background and related work. Section 3 explains the strategy and algorithms for level generation. Section 4 presents the experimental evaluation. Finally, we conclude the paper in section 5.

                         Background

Procedural Content Generation
PCG is a technique for generating game content automatically through some pseudo-random process, including levels (Dahlskog et al. 2014; Adrian et al. 2013), maps (Togelius et al. 2010), music (Jordan et al. 2012), racing tracks (Kemmerling et al. 2010), weapons (Hastings et al. 2009), and terrains (Frade et al. 2012). Though PCG is not a silver bullet for game designers, it has been widely used for rogue-like games by indie developers. Automated content generation techniques can help indie developers limit the cost and time of development, as showcased by successful games such as No Man's Sky (https://www.nomanssky.com/), The Binding of Isaac (http://store.steampowered.com) and Faster Than Light (http://store.steampowered.com).

VGDL, GVG-AI and GVG-LG Framework
The Video Game Description Language (VGDL) (Schaul 2013) was designed for Stanford's General Video Game Playing (GVGP) effort. VGDL is a simple description language for defining a variety of 2D games. The General Video Game AI (GVG-AI) framework (Perez et al. 2016) is the basis for the general video game playing competition, in which participants create agents and test them against a variety of games. The General Video Game Level Generation (GVG-LG) track (Khalifa et al. 2016) is built on top of GVG-AI; it allows participants to create generators that produce levels for a set of different games. The framework ships with three sample level generators: random, constructive, and search-based.
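In GVG-LG, a generated level is a grid of characters that the framework decodes through the game's level mapping. The following Python sketch illustrates the idea only; the mapping, characters, and level below are hypothetical rather than taken from a real VGDL definition:

```python
# Hypothetical level mapping: character -> sprite names it spawns.
LEVEL_MAPPING = {
    "w": ["wall"],
    "A": ["avatar"],
    "g": ["goal"],
    ".": ["floor"],
}

# Hypothetical 4x4 level as a list of rows.
LEVEL = [
    "wwww",
    "wA.w",
    "w.gw",
    "wwww",
]

def decode(level, mapping):
    """Return a dict of sprite name -> list of (row, col) positions."""
    positions = {}
    for r, row in enumerate(level):
        for c, ch in enumerate(row):
            for sprite in mapping.get(ch, []):
                positions.setdefault(sprite, []).append((r, c))
    return positions

sprites = decode(LEVEL, LEVEL_MAPPING)
print(sprites["avatar"])  # [(1, 1)]
```

A generator's job, then, is to emit the character grid; the framework's game description supplies the mapping.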
Deceptive Games
In a recent study, Anderson et al. (2018) introduced the concept of deceptive games. These games are designed to lead the agent away from a global optimum policy. To showcase the vulnerability of game-playing agents, a number of deceptions were designed, each focused on a certain cognitive bias. The results showed that different game-playing agents had different weaknesses. In this paper, we build on this existing work and create a deceptive generator that produces multiple cognitive traps: the greedy trap, the smoothness trap, and the generality trap. For this task we used the GVG-LG framework, and the algorithm was able to generate a variety of traps for a large set of games.

                          Method
Classification of Games
In order to implement deceptions in the game set of the GVG-LG framework, we thoroughly analyzed all 91 games. This initial exhaustive exploration was important because we had to exclude the games in which our implemented deceptions would not make sense. While exploring the games of the GVG-LG framework, we identified games that correspond well to our deceptive algorithms, including zelda, whackamole, and roguelike. In total, 31 games were relevant to our deceptions, and our algorithms give the best results on those. The remaining games do not respond positively to our deceptive algorithms, exhibiting problems such as unplayable levels, resource deficiencies, and the lack of a requisite environment.

Deceptive Generator
The deceptive generator generates three different types of traps: the smoothness trap, the generality trap, and the greedy trap. Overall, generation works in four steps: initialization, greedy trap, smoothness trap, and generality trap. Before explaining the details of the algorithms, we explain the concepts that are necessary to understand how each algorithm works:

• Game Description: This file specifies how the game is played and all of its interactions.

• Sprites: Sprites are the main game elements. They are defined in the SpriteSet section of the game description file (VGDL), where each sprite has a name, a type, and some parameter values.

• HashMap: This function returns a hash map that helps decode the currently generated level.

   Algorithm 1: Fill Up ArrayList
 1 SpriteData = currentGame.getAllSpriteData();
 2 HashMap = currentGame.getLevelMapping();
 3 KeySet = HashMap.getKeySet();
 4 foreach key in KeySet do
 5     TempArrayList = HashMap.get(key);
 6     SpriteName = TempArrayList.get(spriteIndex);
 7     foreach sprite in SpriteData do
 8         if sprite.name == SpriteName and sprite.TypeMatch then
 9             ArrayList.Put(key);
10         end
11     end
12 end

   Algorithm 1 extracts all the sprite data from the game object and fills different ArrayLists according to sprite type. Lines 1 to 3 are responsible for extracting the sprite data, the hash map, and the key set associated with each sprite type. The next phase of the algorithm (lines 4 to 12) matches each key with its sprite type and populates the corresponding ArrayList.

   Algorithm 2: Generate Greedy Trap
    /* call Initialization Routine.
       This routine is also called by the other traps. */
 1 GRIDSIZE = 12;
 2 gridChoice = generateRandom(1,2);
 3 avatarPos = generateRandom(1,2);
    /* Initialization Routine ends */
 4 if gridChoice == 1 then
 5     stripRow = GRIDSIZE/4;
 6     foreach rows >= 1 and rows <= GRIDSIZE do
 7         foreach cols >= 1 and cols <= GRIDSIZE do
 8             if rows = stripRow AND avatarPos = 1 then
 9                 while cols <= keysLength and keyType = avatar OR goal OR wall do
10                     grid += keyType.get()
11                 end
12             end
13             else if rows < stripRow AND rows > 1 then
14                 while keys.Type = resourceKeys AND harmfulKeys do
15                     grid += keyType.get()
16                 end
17             end
18             else
19                 while keys.Type = resourceKeys do
20                     grid += keyType.get()
21                 end
22             end
23         end
24     end
25 end

   The first trap implemented in our generator is the greedy trap. The greedy trap aims to maximize some immediate reward, relying on the assumption that a local optimum will guide the agent to the global optimum.
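Algorithm 1 amounts to grouping level-mapping keys by the category of sprite they spawn. A rough Python equivalent is sketched below; the sprite data and mapping are hypothetical, since in GVG-LG they come from the framework's game object:

```python
# Hypothetical sprite name -> category table.
SPRITE_TYPES = {
    "avatar": "avatar",
    "key": "collectible",
    "monster": "harmful",
    "exit": "goal",
}

# Hypothetical level-mapping key -> sprite names it spawns.
LEVEL_MAPPING = {
    "A": ["avatar"],
    "k": ["key"],
    "m": ["monster"],
    "e": ["exit"],
}

def fill_key_lists(level_mapping, sprite_types):
    """Return a dict of category -> list of level-mapping keys,
    mirroring the per-type ArrayLists of Algorithm 1."""
    lists = {}
    for key, sprite_names in level_mapping.items():
        for name in sprite_names:
            category = sprite_types.get(name)
            if category is not None:
                lists.setdefault(category, []).append(key)
    return lists

print(fill_key_lists(LEVEL_MAPPING, SPRITE_TYPES))
```

The resulting per-category key lists (avatar, collectible, harmful, goal) are what the trap algorithms below draw from when they emit grid characters.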
We took this notion into consideration when designing the greedy trap, which Algorithm 2 implements. Initially, an initialization routine (lines 1 to 3; note that this routine is also called by the other algorithms) is invoked to define the size of the level. The algorithm then divides the level into two parts, one slightly larger than the other. After dividing the level, we place the items and sprites, excluding harmful sprites, in the larger section of the level (lines 4 to 9). In the narrow section of the grid, the algorithm places harmful sprites alongside collectible sprites to make the greedy trap more effective.
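This layout logic can be sketched as follows. The sketch is our own simplification rather than the framework code, and the grid characters ('m' harmful, 'x' collectible, 'w' wall, 'A' avatar, 'g' goal) are hypothetical:

```python
import random

GRID_SIZE = 12

def generate_greedy_trap(seed=0):
    """Return a list of row strings: a narrow risky strip above a wall,
    and a larger safe area holding the avatar and the goal."""
    rng = random.Random(seed)
    strip_row = GRID_SIZE // 4
    grid = []
    for row in range(GRID_SIZE):
        if row < strip_row:
            # Narrow section: collectibles ('x') mixed with harmful sprites ('m').
            grid.append("".join(rng.choice("mx") for _ in range(GRID_SIZE)))
        elif row == strip_row:
            grid.append("w" * GRID_SIZE)  # wall separating the two sections
        else:
            # Larger section: floor and collectibles only.
            grid.append("".join(rng.choice("..x") for _ in range(GRID_SIZE)))
    # Place the avatar and the goal in the safe section.
    safe = list(grid[strip_row + 1])
    safe[0] = "A"
    grid[strip_row + 1] = "".join(safe)
    last = list(grid[-1])
    last[-1] = "g"
    grid[-1] = "".join(last)
    return grid

for line in generate_greedy_trap():
    print(line)
```

A greedy agent chasing the nearest reward wanders into the risky strip, while the safe route to the goal lies entirely in the larger section.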
   Algorithm 3: Generate Smoothness Trap
    /* call Initialization Routine */
 1 if gridChoice == 1 then
 2     medianRow = GRIDSIZE/2;
 3     foreach rows >= 1 and rows <= GRIDSIZE do
 4         foreach cols >= 1 and cols <= GRIDSIZE do
 5             if rows = medianRow and avatarPos = 1 then
 6                 while cols <= keysLength and keyType = avatar OR goal OR wall do
 7                     grid += keyType.get()
 8                 end
 9             else if rows < medianRow then
10                 generateRandomStrips;
11                 while keys.type = collectibles do
12                     grid += keyType.get();
13                 end
14             end
15             else if rows > medianRow then
16                 generateRandomStrips;
17                 while keys.type = collectibles AND harmful do
18                     grid += keyType.get();
19                 end
20             end
21         end
22     end
23 end

   Algorithm 4: Generate Generality Trap
    /* call Initialization Routine */
 1 firstPart = GRIDSIZE/3;
 2 secondPart = (GRIDSIZE - firstPart)/2;
 3 thirdPart = GRIDSIZE - (firstPart + secondPart);
 4 gameLevel = getGameLevel();
 5 foreach rows >= 1 and rows <= GRIDSIZE do
 6     foreach cols >= 1 and cols <= GRIDSIZE do
 7         if gameLevel = 1 OR gameLevel = 2 then
 8             if rows = firstPart then
 9                 avatarKeys.get();
10             end
11             else if rows = secondPart then
12                 collectibleKeys.get();
13             end
14             else if rows = thirdPart then
15                 harmfulSprites.get();
16             end
17         end
18         else if gameLevel = 3 then
19             // Place harmful sprites next to the goal
20         end
21     end
22 end

   The generality trap exploits the concept of surprise by providing a game environment in which a rule holds only for a limited amount of time. Algorithm 4 generates the generality trap. It is worth mentioning that in order to experience a generality trap, the agent or controller has to play at least three levels: the first two levels build up a concept for the agent, while the third delivers the surprise. The algorithm first calls the initialization routine and then divides the level into three parts (lines 1 to 3). After initialization, the algorithm treats each level separately. For levels 1 and 2, it keeps harmful sprites away from goal sprites, so that the avatar has no experience of combat and develops the concept that fighting harmful sprites is a useless activity. When the third and final level is played, however, the algorithm places harmful objects in the vicinity of the goal object, breaking the previously developed concept.
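The three-level schedule can be sketched as follows. Again, this is our own simplification with hypothetical grid characters ('A' avatar, 'x' collectible, 'm' harmful, 'g' goal), not the framework code:

```python
GRID_SIZE = 12

def generate_generality_level(level_number):
    """Sketch of the generality trap: levels 1 and 2 keep the harmful
    sprite far from the goal; level 3 places it right next to the goal."""
    grid = [["."] * GRID_SIZE for _ in range(GRID_SIZE)]
    first = GRID_SIZE // 3                       # avatar row
    second = first + (GRID_SIZE - first) // 2    # collectible row
    grid[first][0] = "A"
    grid[second][GRID_SIZE // 2] = "x"
    goal_row, goal_col = GRID_SIZE - 1, GRID_SIZE - 1
    grid[goal_row][goal_col] = "g"
    if level_number < 3:
        # Harmful sprite in the opposite corner: avoiding it looks optimal.
        grid[0][GRID_SIZE - 1] = "m"
    else:
        # Surprise: the harmful sprite now guards the goal.
        grid[goal_row][goal_col - 1] = "m"
    return ["".join(r) for r in grid]

levels = [generate_generality_level(n) for n in (1, 2, 3)]
```

An agent that generalizes "harmful sprites can always be avoided" from the first two levels is punished on the third.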
   The smoothness trap exploits an assumption that many AI techniques rely on: that good solutions lie close to other good solutions. This assumption can be exploited by a mechanism that hides the optimal solution close to bad solutions. Algorithm 3 implements this idea. In contrast to the greedy trap, the smoothness algorithm divides the level into two segments: one implemented as a smooth path and the other as a harsh path. The smooth path (lines 6 to 9) carries a low level of risk, and the avatar is positioned close to collectible and goal sprites. The harsh, or difficult, path (lines 15 to 20), on the other hand, is implemented as a long path containing both collectibles and harmful sprites.

                       Experimentation
In order to showcase our results, we have generated levels for multiple games. The descriptions of these games are as follows:

• Zelda: The avatar has to find a key in a maze to open a door and exit. The player is also equipped with a sword to kill the enemies in the maze. The player wins upon exiting the maze, and loses if hit by an enemy. Refer to figure 1 for generated levels of Zelda.

• CatPult: The main theme of this game is to reach the exit door to win. The avatar cannot step on ponds of water,
Figure 1: The figure depicts Zelda game levels. From left to right: generality trap level 1, generality trap level 2, generality trap
level 3, smoothness trap and greedy trap (Note lower boundary is cropped).




Figure 2: The figure depicts Catapult game levels. From left to right: generality trap level 1, generality trap level 2, generality
trap level 3, smoothness trap and greedy trap.




Figure 3: The figure depicts Cakybaky game levels. From left to right: generality trap level 1, generality trap level 2, generality
trap level 3, smoothness trap and greedy trap (Note lower boundary is cropped).




Figure 4: The figure depicts DigDug game levels. From left to right: generality trap level 1, generality trap level 2, generality
trap level 3, smoothness trap and greedy trap.


  however it can jump over them using catapults. Each catapult can be used only once. Refer to figure 2 for generated levels of CatPult.

• Cakybaky: In CakyBaky, you have to bake a cake. To do this, several ingredients must be collected in the order given by the recipe. Angry chefs around the level chase the player, although each cares only about its favorite ingredient, so only the chefs that prefer the next ingredient to be picked up are active at any time. Refer to figure 3 for generated levels of CakyBaky.

• DigDug: The objective of this game is to collect all gems and gold coins in the cave, digging a way through it. There are also enemies in the level that kill the player on collision. Refer to figure 4 for generated levels of DigDug.

• SolarFox: The main theme of this game is to collect all the diamonds on the screen while avoiding the attacks of enemies. The brake of the spaceship controlled by the player is broken, so the avatar is in constant movement. Refer to figure 5 for generated levels of SolarFox.
Figure 5: The figure depicts SolarFox game levels. From left to right: generality trap level 1, generality trap level 2, generality
trap level 3, smoothness trap and greedy trap.


                                   Table 1: Performance of controllers for generated levels.

         Agents          Algorithm             Win Percentage   Normalized    Weakness                  Best Performing
                                                                Mean Score                              Game
         Number27        Portfolio             52%              0.60          Smoothness Trap           DigDug
         thorbjrn        MCTS                  40%              0.71          Greedy Trap               Zelda
         OLETS           Optimistic Planning   40%              0.63          Smoothness/Greedy Trap    CataPults
         NovelTS         Tree                  28%              1.0           Greedy Trap               Zelda
         sampleRS        Random                28%              0.96          Greedy Trap               CataPults
         sampleRHEA      EA                    28%              0.80          Greedy Trap               CakyBaky
         NovTea          Tree                  20%              0.40          Generality/Greedy Trap    CataPults
         CatLinux        GA                    20%              0.33          Generality/Greedy Trap    CataPults
         sampleMCTS      MCTS                  16%              0.45          Greedy Trap               CakyBaky
         sampleOneStep   One State Lookup       8%              0.13          All                       Zelda
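The ordering used in Table 1 (win rate first, normalized mean score as the tie-breaker) can be reproduced with a simple sort over the reported values:

```python
# Agent results copied from Table 1: (name, win percentage, normalized mean score).
results = [
    ("Number27",      52, 0.60),
    ("sampleMCTS",    16, 0.45),
    ("thorbjrn",      40, 0.71),
    ("OLETS",         40, 0.63),
    ("NovelTS",       28, 1.00),
    ("sampleRS",      28, 0.96),
    ("sampleRHEA",    28, 0.80),
    ("NovTea",        20, 0.40),
    ("CatLinux",      20, 0.33),
    ("sampleOneStep",  8, 0.13),
]

# Sort by win percentage, breaking ties with the normalized mean score.
ranked = sorted(results, key=lambda r: (r[1], r[2]), reverse=True)
for name, win, score in ranked:
    print(f"{name:15s} {win:3d}% {score:.2f}")
```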


The generated levels were tested on a wide variety of agents, including OLETS, sampleMCTS, sampleRHEA, sampleRS, NovTea, NovelTS, Number27, sampleOneStep, CatLinux and thorbjrn. Most of the agents were collected from the GVG-AI competitions, and some are advanced sample controllers. The agents were selected for the unique features of their algorithms. Each agent was run multiple times on each level produced by our deceptive generator; in total, 250 levels were played by the 10 controllers. The results of the experimentation are shown in table 1. Note that the agents are ranked according to their win rate and mean score. No single algorithm was able to solve all the deceptions. The Number27 agent was the most successful by win percentage, and the onesteplookahead agent was the least successful of all. It is important to note that the majority of the agents performed well on Zelda; however, no agent was able to successfully play the levels of SolarFox. In addition, we can see from table 1 that the game-playing controllers had different weaknesses (generality, smoothness, or greedy trap).

           Conclusion and Future Work
Deceptive games are designed to move agents away from a global optimum policy. In this paper, we have created a deceptive generator that produces different types of traps: the generality, smoothness, and greedy traps. The generator was successful in producing a large variety of levels for a set of games. To test the generator, we created example levels for five different games: Zelda, CatPult, CakyBaky, SolarFox, and DigDug. In addition, ten different controllers were tested. The results indicate that each type of deception affects agent performance differently; no single agent was able to solve all types of traps. The best among all was Number27 and the worst was onesteplookahead.
   In the future, we plan to add more traps to our deceptive generator, preferably for the games to which the three included traps are not suited. Another important future step is the creation of agents or controllers that can solve as many as possible of the traps posed by our deceptive generator.

                  Acknowledgment
We thank Riphah International University for supporting this research.
                       References
Anderson, D., Stephenson, M., Togelius, J., Salge, C., Levine, J., Renz, J. (2018, April). Deceptive games. In International Conference on the Applications of Evolutionary Computation (pp. 376-391). Springer, Cham.

Khalifa, A., Perez-Liebana, D., Lucas, S. M., Togelius, J. (2016, July). General video game level generation. In Proceedings of the Genetic and Evolutionary Computation Conference 2016 (pp. 253-259). ACM.

Perez-Liebana, D., Samothrakis, S., Togelius, J., Lucas, S. M., Schaul, T. (2016, February). General video game AI: Competition, challenges and opportunities. In Thirtieth AAAI Conference on Artificial Intelligence (pp. 4335-4337).

Shaker, N., Togelius, J., Nelson, M. J. (2016). Procedural content generation in games. Switzerland: Springer International Publishing.

Dahlskog, S., Togelius, J. (2014, August). A multi-level level generator. In Computational Intelligence and Games (CIG), 2014 IEEE Conference on (pp. 1-8). IEEE.

Adrian, D. F. H., Luisa, S. G. C. A. (2013, August). An approach to level design using procedural content generation and difficulty curves. In Computational Intelligence in Games (CIG), 2013 IEEE Conference on (pp. 1-8). IEEE.

Schaul, T. (2013, August). A video game description language for model-based or interactive learning. In Computational Intelligence in Games (CIG), 2013 IEEE Conference on (pp. 1-8). IEEE.

Frade, M., de Vega, F. F., Cotta, C. (2012). Automatic evolution of programs for procedural generation of terrains for video games. Soft Computing, 16(11), 1893-1914.

Jordan, A., Scheftelowitsch, D., Lahni, J., Hartwecker, J., Kuchem, M., Walter-Huber, M., ... Preuss, M. (2012, September). BeatTheBeat: music-based procedural content generation in a mobile game. In Computational Intelligence and Games (CIG), 2012 IEEE Conference on (pp. 320-327). IEEE.

Kemmerling, M., Preuss, M. (2010, August). Automatic adaptation to generated content via car setup optimization in TORCS. In Computational Intelligence and Games (CIG), 2010 IEEE Symposium on (pp. 131-138). IEEE.

Togelius, J., Preuss, M., Beume, N., Wessing, S., Hagelbäck, J., Yannakakis, G. N. (2010, August). Multiobjective exploration of the StarCraft map space. In Computational Intelligence and Games (CIG), 2010 IEEE Symposium on (pp. 265-272). IEEE.

Togelius, J., Yannakakis, G. N., Stanley, K. O., Browne, C. (2010, April). Search-based procedural content generation. In European Conference on the Applications of Evolutionary Computation (pp. 141-150). Springer, Berlin, Heidelberg.

Hastings, E. J., Guha, R. K., Stanley, K. O. (2009). Automatic content generation in the Galactic Arms Race video game. IEEE Transactions on Computational Intelligence and AI in Games, 1(4), 245-263.