Solstice Cyclists part 2: data ingestion

Start with Part 1 to learn how I captured 4000 photographs of the mostly-naked, mostly-painted Solstice Cyclists.

Inadequate spreadsheet

My naive start was a spreadsheet with columns for what people wore, what they rode, and a description of their paint.  I used the row number of the spreadsheet as the cyclist ID & tagged images that contained a certain cyclist with that number.  I had columns for top, bottom, head, and face clothing.  That didn’t account for people wearing fairy wings, or sunglasses & a fake beard at the same time.  Putting each piece of data in a separate column meant that I could search for “red” or for “sunglasses”, but not for “red person wearing sunglasses”.  So the spreadsheet was not expressive enough to capture the information, and search was so inefficient that dupe-checking took a long time.

Database design

Instead of giving each cyclist clothing slots that could each hold 0 or 1 items, I created a many-to-many relationship between clothes and cyclists. Each piece of clothing also had a “slot” attribute (top, bottom, head, face, back, or other). So a cyclist could wear any number of items, and each item would keep track of where it was worn.  Cyclists & images also had a many-to-many relationship. Vehicle & Sex were simple enumerations.  Descriptions remained as plain text.
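The post doesn’t show the actual schema, so here’s a minimal sqlite sketch of the design described above. Table and column names are my guesses (only images_show_cyclists is named in the post), and the sample data is invented:

```python
import sqlite3

# Hypothetical schema matching the description: many-to-many between
# cyclists and clothes, many-to-many between images and cyclists.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE cyclists (
    id INTEGER PRIMARY KEY,
    sex TEXT,                -- simple enumeration
    vehicle TEXT,            -- simple enumeration
    description TEXT         -- plain text
);
CREATE TABLE clothes (
    id INTEGER PRIMARY KEY,
    name TEXT,
    slot TEXT CHECK (slot IN ('top','bottom','head','face','back','other'))
);
CREATE TABLE cyclists_wear_clothes (   -- many-to-many
    cyclist_id INTEGER REFERENCES cyclists(id),
    clothing_id INTEGER REFERENCES clothes(id)
);
CREATE TABLE images_show_cyclists (    -- many-to-many
    image_file TEXT,
    cyclist_id INTEGER REFERENCES cyclists(id)
);
INSERT INTO cyclists VALUES (1, 'F', 'bicycle', 'mostly red');
INSERT INTO clothes VALUES (1, 'sunglasses', 'face');
INSERT INTO cyclists_wear_clothes VALUES (1, 1);
""")

# "red person wearing sunglasses" is now a join, not two unconnected
# column searches:
rows = conn.execute("""
    SELECT DISTINCT c.id FROM cyclists c
    JOIN cyclists_wear_clothes w ON w.cyclist_id = c.id
    JOIN clothes cl ON cl.id = w.clothing_id
    WHERE c.description LIKE '%red%' AND cl.name = 'sunglasses'
""").fetchall()
```

The join is what the spreadsheet couldn’t express: it connects the “red” and the “sunglasses” to the same cyclist.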

Spreadsheet to database

Converting all the data in the spreadsheet to DB records let me remove inconsistencies in how I had entered the data, e.g. “wig, blue” vs. “blue wig”.  As I added clothes & vehicles to the DB, I searched & replaced those words in the spreadsheet with the DB IDs.  I had to be careful to replace only words in the appropriate columns, since the plain-text descriptions sometimes referenced clothing or vehicles. Sometimes I missed one and ended up with a description like “Mostly red, wearing a green 73”, which is quite confusing.

Once I’d replaced all the words with database IDs, I exported the spreadsheet as a CSV file and wrote a PHP script to ingest it into the database. I chose PHP because I’d already done a lot of SQL with PHP for my Atlanta Fashion Police & convention gallery projects.  The script was pretty simple: the line number was the cyclist ID, the first column contained an ID for Table X, the second column an ID for Table Y, and so on. My PHP server has a maximum execution time of 30 seconds, so I added parameters to the script to ingest only 100 lines at a time and ran the script multiple times. Since it’s a private PHP server with no consumer traffic, I should have just increased the timeout, let the script run, then changed it back.
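The original script was PHP and isn’t shown; this Python sketch keeps the same shape. The column meanings here are invented stand-ins for the real lookup tables, and the chunking mirrors the 100-lines-per-run workaround:

```python
import csv, io, sqlite3

def ingest(db, csv_text, start, count):
    """Load `count` CSV rows beginning at 0-based row `start`.

    As described above: the row number doubles as the cyclist ID,
    and each column holds an ID for some lookup table (the single
    vehicle_id column here is a made-up example)."""
    reader = csv.reader(io.StringIO(csv_text))
    for line_no, row in enumerate(reader):
        if line_no < start:
            continue
        if line_no >= start + count:
            break            # stop before the server's 30 s limit bites
        db.execute("INSERT INTO cyclists (id, vehicle_id) VALUES (?, ?)",
                   (line_no + 1, int(row[0])))

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE cyclists (id INTEGER PRIMARY KEY, vehicle_id INTEGER)")
data = "\n".join(f"{i % 3}" for i in range(250))

# Run the script "multiple times", 100 rows per pass:
for start in (0, 100, 200):
    ingest(db, data, start, 100)
total = db.execute("SELECT COUNT(*) FROM cyclists").fetchone()[0]
```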

While building the spreadsheet, I had been tagging photographs in Lightroom with cyclist IDs. I exported the tagged photographs into a certain directory, then wrote another PHP script to iterate through all files in that directory, read the EXIF data, and fill in the images_show_cyclists table.
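The EXIF-reading script is also PHP and not shown. Assuming Lightroom exported the cyclist tags as plain numeric keywords (an assumption; the real keyword format may differ), the ID-extraction step might look like:

```python
def cyclist_ids(keywords):
    """Pull cyclist IDs out of one image's keyword list.

    Assumes every purely numeric keyword is a cyclist ID; the real
    script read the keywords from each file's EXIF data and wrote
    rows into images_show_cyclists."""
    return sorted(int(k) for k in keywords if k.isdigit())

ids = cyclist_ids(["solstice", "fremont", "417", "88"])
```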

New frontend

This is my process for identifying cyclists going forward.  I look at an image in Lightroom and find a new cyclist who was not in the previous image.  I may scrub back and forth in the timeline to get a better view.  I fill in the search/create page to see if I have already seen a similar cyclist.

New “clothing” dropdowns are created as existing ones are filled in, so I can specify any number of clothing items. The “description” field checks for each word in order, so “blue yellow” matches both “blue & yellow stripes on arms” and “blue torso, red arms, black legs, goofy hat, yellow face”.
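A sketch of that in-order matching (the site itself runs on PHP/SQL; this Python version just illustrates the behavior):

```python
def matches(query, description):
    """True if every query word appears in `description`, in order."""
    pos = 0
    low = description.lower()
    for word in query.lower().split():
        pos = low.find(word, pos)
        if pos == -1:
            return False
        pos += len(word)      # next word must appear after this one
    return True

a = matches("blue yellow", "blue & yellow stripes on arms")
b = matches("blue yellow", "blue torso, red arms, black legs, goofy hat, yellow face")
c = matches("blue yellow", "yellow & blue stripes")   # wrong order
```

Note that order matters: the third example contains both words but fails, because “yellow” precedes “blue”.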

Clicking “Find matching cyclists” will either show a list of cyclists with the features I’ve selected, or unlock the “CREATE” button if there are no matching cyclists.  Each matching cyclist is a link that takes me to a page that lists its features, what images it appears in, and previews one of those images.

Having a picture of the cyclist on the “view cyclist” page makes it much easier to confirm if I’ve actually found the cyclist I’m looking for, since I can just look between the two images.

The “EDIT this cyclist” page is almost identical to the search/create page, but instead of starting blank, it starts with data filled in from the DB.

Cyclists make multiple laps and groups tend to stick together, so if I see one cyclist back for a second lap, I can look at photographs from her first lap and identify some of the cyclists around her as well.

Preliminary data

I haven’t examined all the photographs yet, but here are some things I’ve discovered so far.

In the 2012 photos, taken over 50 minutes, I identified 1475 Solstice Cyclists.

Here’s a graph of how many passed over time.  1 pixel vertically = 1 cyclist; 1 pixel horizontally = 1 second.  Red represents cyclists on their first lap. Green is the second lap. Blue is the third.  There are gaps when my view was blocked, when the street was empty, when I had to switch memory cards, and when traffic stopped & I paused the automatic camera.

The male/female split is 49/51, even closer than Dragon Con’s demographics, and very different from the split seen in most photographers’ galleries, in which images of women dominate.  Hmmmmmm. How curious.  HMMMMMM.

1300 people rode bicycles, which is to be expected from a group called the Solstice Cyclists, but I also saw:

  • 39 people on foot
  • 23 on inline skates
  • 6 on roller skates
  • 24 on scooters
  • 5 unicycles
  • 10 people on 5 tandem bikes
  • 7 skateboards
  • 2 pedicabs, with 2 drivers & 4 passengers

I also identified some groups & popular “costumes”:

  • 11 giraffes
  • 45 mermaids
  • 8 Care Bears
  • 39 people wearing actual, normal clothes
  • 27 Wonder Women

I still have around 700 images to look through, so these numbers will change a bit, but as you can see from the graph, most of the cyclists in these later images are back for another lap, and there aren’t many new cyclists.

Once all that is done, I can start (START!) on the actual meat of this project: creating a grammar for bodypaint based on these thousands of examples & generating new paint patterns.

Solstice Cyclists part 1: data capture

The Solstice Cyclists, an intentionally-disorganized group of mostly-naked, mostly-painted cyclists who precede and overwhelm the Fremont Solstice Parade each year, are one of my favorite groups to photograph.  They are colorful, creative, joyful, and high-energy.

Last year I decided I needed a photo of every single Solstice Cyclist.  (Does this seem familiar?)  I had two reasons:

  1. Statistics. Photographers’ galleries contain mostly women. Is this disparity caused by population imbalance or by selection bias? Solstice Cyclists are famous for being naked cyclists, but some people wear some clothing. How common is that?  Which protective device is more common: bike helmet or sunglasses?
  2. Source data for grammar. I want to expand my bodypaint generator to use graphics, and I want the generator’s output to mirror actual paintjobs. Once I identify all the different cyclists, I can study their paintjobs, break them down into parts, and put those parts back together in novel but believable ways.

I got a tripod and a timer so one camera could automatically photograph everyone who passed while I did my normal photography beside it. I had to juggle a surprising number of factors to place that camera properly.

  • To avoid being blocked by spectators, the tripod needed to be either right next to the street, or high enough to shoot over their heads. I saw a few balconies, stout tree branches, even a bridge, that could get the needed height, but that brought new problems. Most paintjobs photograph best from the front, and bicycle riders tend to lean forward, so a camera that is too high has a bad angle. Also, accessing those high places is non-trivial, so I opted for a front-row seat.
  • Aiming down the street at approaching cyclists is my usual MO, but an automated camera will have trouble with that.  Since the camera is looking down the street, cyclists in the same image can be 10 feet or 100 yards away. How does the camera know which one to focus on, and which ones to leave blurry?  Cyclists in front will obstruct the camera’s view of cyclists behind them.
  • The route turns a few times. Maybe setting up at a corner will alleviate these issues. Setting up just after a corner sets a maximum distance at which cyclists will appear. Any further and they’d be in the crowd. There’s still the problem of cyclists approaching the camera and filling the frame, blocking other cyclists.
  • What about aiming across the street?  Cyclists will stay about the same distance from the camera as they cross the frame, and they are only 3 or 4 abreast, as opposed to unlimited ranks front-to-back, so obstruction is less of an issue.  Since I’m as far forward as possible (so spectators don’t stand in front of me) cyclists on the near side of the street will be very close. My lens might not be wide enough to capture their whole bodies, and they will cross the frame very quickly, maybe in between ticks of the automatic timer.
  • Thus, I decided to shoot across the street at the cyclists on the far side of the road. The frame is wide enough at that range that I’ll get several photos as each cyclist passes. Three-quarter to side view is not ideal, but still pretty good.  I had to accept cyclists on the near side sometimes blocking the shot, but it was the best I could do.
  • Oh, also! Position along the parade route matters as well.  The Cyclists circle back so they stay close to the parade (human-powered floats are much slower than bicycles). Near the end of the parade route, there are fewer spectators and no returning cyclists to block my view, but I only get one chance to see each cyclist, and some cyclists leave the route before then (mechanical failures, etc.) Closer to the start of the route, I get multiple chances to photograph each cyclist, but more obstructions.

The day before the parade I scouted the parade route, looking for places to set up.

I chose the spot on the right, which is near the “center of the universe” sign on the east side of Fremont Ave. The tree gave some protection to the tripod. It’s a lot easier to accidentally trip over a tripod than it is to walk into a tree.

During the parade I kept looking over at the “shots remaining” counter on the tripod-mounted camera like the marines watching the sentry guns in Aliens.  “That number is going down.  It, it keeps going down.  Are we going to run out before they stop coming?”  The automatic camera filled a 32GB memory card and I had to swap it for another in the middle of the parade.  Whenever a traffic jam stopped the stream of cyclists passing me, I’d pause the automatic camera to save disk space.

In all, the automatic camera captured 2644 images.  That’s equivalent to an entire day of Atlanta Fashion Police, except it took only 63 minutes, not 16 hours.  I took an additional 1400 photos with the camera I was holding.

I considered using computer vision to help me identify cyclists, but even nudity-detecting algorithms were bamboozled by the cyclists’ coloration. So I couldn’t even get “Yes, there is a person in this photo”, much less, “There are 6 people in this photo, and the guy with the red stripes and sunglasses has appeared in 3 other photos.” Time to use my eyes, the best pattern-recognizers I know! I thought I could store all the information in a CSV file. I’m only recording a few pieces of data for each cyclist, do I really have to make an SQL database with webforms to search and update it?

1064 rows later, I realized that, yes, I did need that DB.  Since cyclists could make several laps, and I was gathering data from both cameras, I needed to check for duplicate cyclists often.  Ctrl-F in a spreadsheet wasn’t cutting it.

Next time: building that database, and a few insights from the data.

PROCJAM 2017: Spaceship Wrecker

PROCJAM is a relaxed game (or anything) jam with the motto “Make Something That Makes Something”. It basically ran from 4 NOV 2017 to 13 NOV 2017, but the organizers emphasize not stressing out about deadlines.

Procedural generation is my jam, as you may have noticed from my Pathfinder Twitterbots and the little generators on my site.  I didn’t want to generate terrain, caves/dungeons, or planets, because so many generators like that already exist.  I had no shortage of ideas to choose from, though.  Deciding which one to pursue took quite some time!  Some potential projects:

  • Generate spaceships from subsystems that produce & consume various resources
  • Generate fantasy creatures with different senses & capabilities, and individuals of those races who may have disabilities or mutations
  • Generate buildings that accommodate multiple fantasy creatures with widely varying needs.
  • A MUD Twitterbot with emoji visualization
  • Generate footprints that players can follow, and a field guide that identifies the creatures that leave the footprints.

The generators I want to make have lots of constraints and dependencies. Many generators are stateless: the number of projectiles a gun fires can be chosen without regard for the projectile’s damage, fire rate, or elemental affinity.  Not so the fantasy creatures, who won’t use a spoken language if they can’t hear, or the spacecraft, which can’t use a science lab without a generator to power it.  I feel the added complexity in generation is worth it, because it forces the generated artifacts to make sense.

I chose “Spaceship Wrecker”, which generates a spaceship full of subsystems, then lets the player launch an asteroid or bullet to damage some of those systems and watch the failures cascade across the ship. In my mind I envision players boarding wrecked spaceships, prying open airlocks, re-routing cables, and getting the ship back online, but let’s start small, build up incrementally, and see how far I get in a week.

What parts do I need, and what do they depend on?

  • Engines (move the ship)
  • Fuel tanks (supply the engines)
  • Generators (supply electrical power to all kinds of parts)
  • Life support (supply air to rooms with people in them)
  • Crew quarters (supply crew to operate various parts)
  • Command/cockpit/bridge
  • Mission systems (sensors, cargo, labs, etc.)

This gave me my list of resources:

  • Air
  • Crew
  • Fuel
  • Power
  • Thrust (technically that’s two resources: engines overcome inertia with thrust, but it’s simpler to create demand for engines by saying that parts consume thrust.)

I built some placeholder assets as Unity prefabs: 1-meter cubes, color-coded by function, with positive or negative resource values to indicate what they produced and consumed. At first I kept track of supplies at the ship level. If the need for power across all parts on the ship was X, I added enough generators to supply X power.  I didn’t care which generators supplied which components yet.  I would add some graph system later to distribute the resources.

I could specify a few components to start with, and the generator would add components until all components were satisfied.  Fun edge case: a ship with no components has no unmet needs, and thus is a valid ship.
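That first, ship-level algorithm can be sketched like this. The part stats are invented for illustration; positive numbers produce a resource, negative numbers consume it:

```python
# Invented part stats: positive = produces, negative = consumes.
PARTS = {
    "generator":    {"power": +3, "crew": -1},
    "quarters":     {"crew": +2, "power": -1, "air": -2},
    "life_support": {"air": +4, "power": -1},
    "lab":          {"power": -2, "crew": -1},
}

def complete(ship):
    """Add parts until every resource need on the ship is met.

    Supplies are tracked per resource at the ship level, as in the
    first algorithm described above; no part-to-part connections."""
    def totals():
        t = {}
        for p in ship:
            for res, n in PARTS[p].items():
                t[res] = t.get(res, 0) + n
        return t

    while True:
        needy = [r for r, n in totals().items() if n < 0]
        if not needy:
            return ship          # note: an empty ship has no unmet needs
        res = needy[0]
        # add the first part that produces the lacking resource
        ship.append(next(p for p, s in PARTS.items() if s.get(res, 0) > 0))

ship = complete(["lab"])
```

The fun edge case from the post falls out naturally: `complete([])` returns an empty, “valid” ship.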

Right after finishing that working algorithm that created sensible ships, I changed my data model & threw that algorithm out for a better one.  I made “ResourceProducer” and “ResourceConsumer” components to add to spaceship parts. Producers could form connections to Consumers, so each component knew how its resources were allocated.  When a component was damaged (remember the player-launched asteroid?) it could notify its consumers that the supplies were gone. Those parts would shut down, and their producer components would revoke resources from other components, spreading destruction across the ship.
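The producer/consumer cascade might look like the sketch below. It simplifies the real model (names are invented, and each part is assumed to have a single supplier, so every revocation is a shutdown; the real components presumably tracked allocations per resource):

```python
class Part:
    """A spaceship part with links to the parts it supplies."""
    def __init__(self, name):
        self.name = name
        self.online = True
        self.consumers = []      # parts drawing resources from this one

    def supplies(self, other):
        self.consumers.append(other)

    def destroy(self):
        if self.online:          # guard against revisiting (and cycles)
            self.online = False
            for c in self.consumers:
                c.destroy()      # revoke resources; failure cascades

generator = Part("generator")
life_support = Part("life support")
quarters = Part("crew quarters")
lab = Part("lab")
generator.supplies(life_support)
life_support.supplies(quarters)
generator.supplies(lab)

generator.destroy()              # the asteroid hits the generator
offline = [p.name for p in (life_support, quarters, lab) if not p.online]
```

One hit on the generator takes life support, the crew quarters, and the lab offline with it, which is the cascade shown in the screenshot.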

Here a part has been hit by an asteroid (indicated by the green line) and turns red to show it’s not working. Events propagate and three other components also shut down.  Success!

Let’s talk a bit about that asteroid. I imagine a tiny thing going extremely fast. Anything it hits is wrecked, and it penetrates to a significant depth, so multiple parts can go offline from the initial hit if it’s lined up correctly.  I let the player orbit the ship with the camera, then click to launch the asteroid from the camera position toward the cursor position. I raycast to find the impact point, then spawn a trigger volume oriented in the same direction as the ray.  Spaceship parts know they are damaged when their colliders intersect with the trigger. It took a few tries to get the trigger volume’s transform correct. I learned that some angle vectors contain Euler angles, so the 3 components are degrees of rotation around each axis. Other angle vectors are unit vectors that point in the desired direction.
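The difference between those two kinds of angle vector comes down to a conversion. This is not code from the project, just a sketch using Unity’s convention (x = pitch, y = yaw, +z forward, positive pitch looking down):

```python
import math

def direction_from_euler(pitch_deg, yaw_deg):
    """Convert Euler angles (degrees) to a unit direction vector."""
    p, y = math.radians(pitch_deg), math.radians(yaw_deg)
    return (math.cos(p) * math.sin(y),    # x
            -math.sin(p),                 # y: positive pitch looks down
            math.cos(p) * math.cos(y))    # z

v = direction_from_euler(0, 90)   # yaw 90 degrees: pointing along +x
```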

The ring structure was a placeholder for a more meaningful arrangement that I had been putting off because it was difficult. I wanted the parts clustered together because that’s how we envision cool spaceships and because the player could then line up asteroid impacts on multiple parts.  Parts should be connected by wires or corridors.  Parts that shared resources should be close together. Engines should be in the back. Fuel tanks should be far away from crew quarters. There were so many constraints I could place on the system!

I was also replacing my 1-meter cubes with low-poly models of differing sizes.  I tried spawning parts with space in between them and using SpringJoints to pull them together, but SpringJoints maintain a set distance: I had found a way to push parts away from each other, the opposite of what I wanted.

I thought about trying to place parts at the origin, seeing if they collided with anything, and pushing them to the edge of that hitbox if they did. I wasn’t sure what would happen once several parts were placed, and the first push might push the new part out of one part and into another.

I made a 2D Boolean array in which each cell represented a square meter that was either empty or occupied. As I spawned a new part, I’d get its size from its collider & try to fit a box of that size into the grid, starting at the center. If it didn’t fit, I pushed it in a random direction until it did. So my ships expanded from the center and all the parts touched each other.
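A sketch of that placement scheme, assuming a square grid and rectangular footprints (the real version read sizes from Unity colliders):

```python
import random

def place(grid, w, h):
    """Find a spot for a w x h part on a 2D occupancy grid.

    Start at the centre and take a random walk until the footprint
    fits, then mark it occupied. Returns the top-left cell (x, y)."""
    size = len(grid)

    def fits(x, y):
        return (x + w <= size and y + h <= size
                and not any(grid[j][i]
                            for j in range(y, y + h)
                            for i in range(x, x + w)))

    x, y = (size - w) // 2, (size - h) // 2
    while not fits(x, y):
        dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        x = max(0, min(x + dx, size - w))   # clamp to stay on the grid
        y = max(0, min(y + dy, size - h))
    for j in range(y, y + h):
        for i in range(x, x + w):
            grid[j][i] = True
    return x, y

random.seed(0)
grid = [[False] * 10 for _ in range(10)]
spots = [place(grid, 2, 2) for _ in range(5)]
```

Because the walk starts at the centre and only moves one cell at a time, ships expand outward and parts end up touching, as described above.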

But the parts only knew that other parts took up space. Related parts didn’t cluster together, and engines pointed their nozzles into bedrooms. Some algorithm research revealed that the “bin packing” problem was NP-hard, so I felt better about not immediately knowing how to proceed. I decided to sidestep the problem by rotating the engines so their nozzles were pointing down. All the parts were on the same 2D plane, so there would never be a part below an engine to get scorched.  I finished replacing all the placeholders with low-poly models and felt pretty good about my complex creations.

As a final step, I added another shader to differentiate between destroyed by the asteroid (bright pink) and shut down by system failure (dark red). I’m still looking to the future, when players go inside these ships to repair them.

So it’s done!  Basically. I should add some UI: instructions for how to interact with the ships. Of course, graphically indicating the connections between parts would be cool.  A spiral search is probably better than a random walk for placing new components. A graph-based approach could improve co-location of related parts. It would be nice to have corridors for the crew to move through. Those could be hit by asteroids too, so each room would need airlocks. Are the airlocks dependent on main power to operate…?

Like I said, it’s basically done!

Pathfinder Bots: Simulation

@FightBot1 and @FightBot2 are Twitter bots that battle each other with randomly-generated level 1 Pathfinder Fighters.

Pathfinder’s combat rules are very complex, so I knew implementing the whole thing was impractical. I chose to exclude spells and skills, and as many special attacks and activated abilities as possible.  Thus I chose the Fighter class, the simplest class that just uses weapons.

Usually, Pathfinder has a Game Master, who has final say on anything that happens in the simulated world.  Players announce what they intend their characters to do, but the GM can modify, interrupt or ignore those actions when necessary.  When playing over Twitter, there is no GM, just the two players passing messages back and forth.  Thus, any action that interrupts another action, as well as hidden information that can affect the outcome of a player’s action, is no good.  That means anything that provokes attacks of opportunity (casting spells, firing ranged weapons, performing combat maneuvers, managing inventory, drinking potions, or even moving) was excluded.

Position and movement gave me trouble as well.  Pathfinder is based on a grid of 5-foot squares (actually cubes, when the game remembers the third dimension). Level 1 fighters can’t fly, so I could ignore height.  Should I simulate a 2D arena? Should it be a featureless square, or a circle, or have terrain? What happens if a fighter runs into a wall? Into a corner?  Maybe a one-dimensional position, just a distance from the opponent, would be sufficient to make ranged weapons, reach weapons, and normal weapons feel different.  If the fighters never take actions that provoke attacks of opportunity, they won’t get interrupted.  But knowing when a fighter is threatened requires knowing what the enemy is wielding. So I decided to only use melee weapons and ignore positioning altogether. If one character has a reach weapon, just pretend that the fighters are making 5-foot steps each round.

So fighters can only perform melee attacks.  What races are allowed, and what equipment and feats will they use? I used only items from the Core Rulebook, not the innumerable books released since.  The CRB has seven races.  Only feats available at level 1 that affect health, initiative, or melee attacks are relevant.  Fighters are proficient in all armor, shields, and simple & martial weapons, so those are in as well.

In subsequent blog posts, I’ll explain the procedural generation of characters & descriptive text, and how I integrated with Twitter.

Building a simulation: what is vs. what should be

When I find something that is fun to do in life, I want to make a game out of it so I can share the experience with others.  But when I closely examine the systems and rules that the world runs on, I realize how messed up they are, and that makes me sad.

I want to build a conflict resolution system where violence is optional and body language is significant.  I look for examples of tense situations that didn’t result in violence and remember some stories my friends told on Twitter.  I am sickened to discover that I’m about to gamify my friends’ trauma, because those stories were about street harassment.  I don’t want to mine the pain of people I care about for a game.  I don’t want to make a game that makes people relive that pain.  But if I accurately simulate human interaction, there will be situations where “don’t make eye contact & hope not to die” will be the ideal response, because those situations are common in real life.

So replicating the awful systems of real life seems cruel, but changing the systems seems dishonest.  All simulations are simpler than the real thing, so I’m required to choose some elements to keep and some to discard.  This is why people say that all games are political.  The game maker decides what parts of reality to consider important, or worthy.  Even if I don’t want that responsibility, I have it, because I can’t replicate a system completely.  Even if I make those choices unthinkingly, I’ve still made them.

Another example is cosplay photography.  I think a board game about managing time and energy while trying to do photoshoots during a convention would be really fun.  Photographers with different styles and goals could be different playable classes.  Seems good, but some photographers seek social capital at the expense of others.  Some exploit minors.  Some won’t shoot men, or black people.  Do I offer these as options for players to choose?  Do players want these options?

More subtly, the resources I picked for a photographer to manage are artistic fulfillment, friendship, and fatigue, because those are the most important factors to me when I photograph a convention.  But my priorities and experiences are not universal.  Other photographers have different priorities: good priorities, not the awful goals from the last paragraph, just different priorities.  So what should I include in my game?

If art is self-expression (that’s a whole blog post by itself) and the game is my art, then I should make systems that appeal to me. Sometimes that will make the fictional world operate the way I think the real world should operate.  Sometimes that will make the game operate in ways I think are mechanically interesting, without regard to real-life applications.

But if the art in games comes from player expression, then the players are limited to the tools I provide them, and I will deny some of them tools they deem important, since people are diverse and I can’t predict what everyone will need from my game.

Procedurally-generated bodypaint

It’s text, but NSFW text.  Procedural Paint-Job Generator.

This idea came to me in the wee hours of the morning.  I got out of bed, coded all morning, and went back to sleep after publishing it.

I’ve been creating a vocabulary to describe the body paint at the Fremont Solstice Parade for a while.   I planned to use that in some sort of database-driven visualization for Solstice Parade photos, somewhat like Atlanta Fashion Police.  I still plan to do that, but this project goes the other way.  Instead of describing an existing paint-job with the vocabulary, I use the vocabulary to create a description of a hypothetical paint-job.  The plausibility of the paint-jobs varies, but that’s part of the charm.

I used Kate Compton’s Tracery to generate the descriptions.  I started by adding all the words I could think of, grouped logically into colors, color modifiers, patterns, animals, vehicles, and so on.  Then I built phrases that combined those elements, built the phrases into clauses, and then into sentences.  It can suggest individual paint-jobs as well as groups, and pluralizing complex phrases is tricky!  The following phrases have identical meanings, but must be modified differently to be pluralized.

  • green and bright yellow giraffe
  • green giraffe with bright yellow spots

Putting an “S” at the end of the phrase doesn’t always work.  There are also things that are always plural, like “roller skates”.

  • You get your roller skates.
  • You get your bicycle.
  • You rent roller skates for your team.
  • You rent bicycles for your team.

English is tricky!

Whenever the generator recommends a pattern, it may recommend two patterns instead.  Those two could also recommend two more, so there’s no guarantee the recursion ever ends.  Browsers have a lot of memory, text doesn’t take much memory, and it’s funny to get a big paragraph recommending 20 different patterns in a single paint-job, so I leave it in.
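The pattern-recommends-two-patterns trick is just a recursive grammar rule. Here is a toy Tracery-style expander in Python with invented rules (the real grammar runs on Tracery in JavaScript and has far more symbols); the third “pattern” option is the recursive case, so expansion depth is unbounded but terminates with probability 1 since each expansion spawns fewer than one new “pattern” on average:

```python
import random

# Invented rules in the spirit of the grammar described above.
GRAMMAR = {
    "pattern": ["#color# stripes", "#color# spots",
                "#pattern# over #pattern#"],     # the recursive case
    "color": ["red", "blue", "green", "bright yellow"],
}

def expand(symbol, rng):
    """Replace #symbol# references until only plain text remains."""
    text = rng.choice(GRAMMAR[symbol])
    while "#" in text:
        pre, sym, post = text.split("#", 2)
        text = pre + expand(sym, rng) + post
    return text

rng = random.Random(7)
paint = expand("pattern", rng)
```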

The paint-jobs produced by this generator are by turns absurd, practical, amusing, and shocking.  What more could a procgen system strive for?

Combat as a skill challenge

D&D 5th edition says it has three pillars: social interaction, exploration, and combat.  Combat is last in the list, but most of the rules deal with combat, not the other two pillars.  Indeed, most d20 games (previous editions of D&D, Pathfinder, etc) share this focus on combat. Yet when I hear people talk about their role-playing experiences and the characters they play, personality and story are more important.  Sure, we talk about the time the Fighter dove into a pack of Ghouls and got paralyzed, but the point of that story is the fighter’s hubris, thinking she was untouchable, not the +2 bonus from flanking or the DC of the paralysis attack.

In this post, I suggest a hack that makes combat work like the other two pillars, thereby excising most of the game.  This makes the “three pillars” equal in complexity, and is also pretty boring.  This is how two of the three pillars of the game have been treated for decades.  The rules for social interactions are so simple that most parties have a guy who does all of it, in addition to his combat duties.  One of the pillars of the game, the part that most people remember when they talk about their games, is a part-time job for one out of four or six people.  Fortunately, humans are good at telling stories, and GMs have been picking up the slack, and games are full of great social interactions completely unsupported by the rules.

In summary, since people care most about their characters and stories, shouldn’t the game focus on those elements first?

Combat as a skill challenge

Add three new skills:

  • Combat (melee) (Str)
  • Combat (ranged) (Dex)
  • Combat (magic) (Int)

Each class gains one or more of these skills as class skills, and gets 1 or 2 more skill ranks per level.  All Combat skills may be used untrained.

Some classes get class features to change the ability score associated with the skill.  For example, Rogues use Dex for melee combat, and Clerics use Wis for spell combat.

Combat is a series of opposed skill checks.

Check: You can change the battle state of nonplayer characters with an opposed check.  The opponent receives a bonus on the check based on its current battle state.  If you succeed, the character’s battle state is decreased by one step. For every 5 by which your check result exceeds the DC, the character’s battle state decreases by one additional step. A creature’s battle state cannot be shifted more than two steps in this way, although the GM can override this rule in some situations. If you fail the check by 4 or less, the character’s battle state is unchanged. If you fail by 5 or more, the character’s battle state increases by one step.  If the enemy’s battle state reaches “Defeated”, it can no longer participate in the battle.  It is slain or captured, your choice.  If the enemy’s battle state reaches “Victorious”, you are incapacitated and can no longer participate in the battle.

Most enemy creatures start with a battle state of “even”, although circumstances like favorable terrain or surprise may change this initial state to “winning” or “losing” at the GM’s discretion.  A difference in power between you and your opponent also affects the opponent’s initial battle state.  Subtract your character level from the opponent’s CR and divide by three.  Increase the opponent’s initial battle state by that many steps.  (This will decrease the opponent’s battle state if you are more powerful.)  This may change the opponent’s battle state to “Defeated” or “Victorious”, which means the battle is over immediately.

Battle state          Check bonus
Defeated              --
Routed                -10
Losing                -5
Even                  0
Winning               +5
Dominating            +10
Victorious            --
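The check resolution above can be sketched in code. This is my reading of the rules, not playtested material: check inputs are assumed to be final totals (roll + modifiers), and since the rules don’t specify rounding for the CR difference, I truncate toward zero:

```python
STATES = ["Defeated", "Routed", "Losing", "Even",
          "Winning", "Dominating", "Victorious"]
BONUS = {"Routed": -10, "Losing": -5, "Even": 0,
         "Winning": 5, "Dominating": 10}

def initial_state(opponent_cr, my_level):
    """Even, shifted by (CR - level) / 3 steps toward Victorious."""
    steps = int((opponent_cr - my_level) / 3)   # rounding is my assumption
    i = max(0, min(len(STATES) - 1, STATES.index("Even") + steps))
    return STATES[i]

def resolve(my_check, opp_check, state):
    """Apply one opposed Combat check (one minute of combat)."""
    dc = opp_check + BONUS[state]
    i = STATES.index(state)
    if my_check >= dc:
        # one step, plus one per 5 over the DC, capped at two steps
        steps = min(1 + (my_check - dc) // 5, 2)
    elif dc - my_check <= 4:
        steps = 0          # fail by 4 or less: no change
    else:
        steps = -1         # fail by 5 or more: opponent's state improves
    # decreasing the state moves toward Defeated (index 0)
    return STATES[max(0, min(len(STATES) - 1, i - steps))]
```

For example, beating the DC by 10 against an Even opponent drops it two steps to Routed, while failing by 5 pushes it up to Winning.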

Action: a combat check represents one minute of combat.

Try again: You may try again until the opponent’s battle state is “Victorious” or “Defeated”, at which point the battle is over.

Special: allies may use the “Aid Another” action with any Combat skill.  For example, a Fighter making a Combat (melee) skill check may be aided by a Bard making a Combat (magic) check and a Ranger making a Combat (ranged) check.

Jam, the Whirlwind Spear

While building “#1 Sap Master” I lamented that there were no finesseable reach weapons, but one does exist!  The Elven Branched Spear is not only finesseable, but gets a +2 bonus to attacks of opportunity.  So I built a Monk around it.  I didn’t try for the Flowing Monk this time, just an Unchained Monk, which is basically Monk 2.0.

The reach weapon (acquired through Ancestral Arms) and her high Dexterity give her lots of AOOs, and Panther Style also gives her a pool of “retaliatory unarmed strikes” that she can trigger each round by provoking AOOs from her enemies. Flying Kick lets her move during a Flurry of Blows, so she can take all three attacks on her turn, move past enemies and retaliate when they take their AOOs, then take her own AOOs if the enemies she left behind try to close in again. She doesn’t hit hard, but she hits often, growing more dangerous as she faces more foes.

I named her after Jam from Guilty Gear, who fights with quick strikes, flying kicks, blazing fast dashes, and a distinctive “HOOOO!” battle cry.  She’s basically that, but with a spear.


Female Half-Elf Unchained Monk 7
N Medium humanoid (human, elf)
Init +7; Senses darkvision 60 ft., Perception +8


AC 25, touch 21, flat-footed 20 (+1 Armor, +5 DEX, +4 WIS, +1 monk, +1 deflection, +3 natural); +4 vs. AOOs
hp 64 (7d10+28)
Fort +9, Ref +12, Will +4 (+2 vs. enchantment, +2 vs charm & compulsion) Immune: disease


Speed 50 ft.
flurry of blows unarmed strike +12/+12/+7 1d8+5
+1 elven branched spear +13/+8 1d8 x3 P (brace, reach, +2 attack on AOOs)

Special Attacks Stunning Fist 7/day FORT 17 stunned 1 rnd OR fatigued 1 min


Str 10, Dex 20, Con 14, Int 9, Wis 20, Cha 7
Base Atk +7; CMB +7; CMD 27
Feats Combat Reflexes, Dodge, Exotic Weapon Proficiency (elven branched spear), Improved Unarmed Strike, Mobility, Panther Style, Panther Claw, Panther Parry, Stunning Fist, Weapon Finesse
Skills Acrobatics +13, Knowledge (history) +3, Knowledge (religion) +3, Perception +8, Sense Motive +10, Stealth +13
Languages Common, Elven
Special Qualities

  • Ancestral Arms: Exotic Weapon Proficiency (elven branched spear)
  • Blended Views: Darkvision 60 ft.
  • Evasion: no damage on successful Reflex save.
  • Ki pool: 7 points
    • spend 1 point: gain 1 attack at full BAB as part of full attack
    • Sudden Speed: swift action, 1 ki point: increase base land speed by 30 ft. for 1 minute.
    • Barkskin: standard action, 1 ki point: +3 natural armor bonus for 70 min.
  • Ki strike: unarmed attacks overcome DR for magic, cold iron, and silver
  • Style Strike
    • Flying kick: During flurry of blows, move up to 20 ft. (provoking AOOs as normal), ending adjacent to a foe and kicking it.
  • Combat Reflexes: 6 AOOs per round
  • Panther Style: When you provoke an AOO by movement, make a retaliatory unarmed strike against the creature making the AOO (limit 4/round). If you damage the creature, its AOO takes -2 on attack and damage.

Traits: reactionary (+2 initiative), focused disciple (+2 saves vs. charm & compulsion)
Gear: +2 cloak of resistance, +2 belt of Dexterity, +2 headband of Wisdom, agile amulet of mighty fists, +1 bracers of armor, +1 elven branched spear, +1 ring of protection, handy haversack, monk’s kit, ioun torch.

Three-armed fighter

In my last post, I said that Triali could be just as effective if she were a two-handed fighter, so I built a two-handed fighter who uses her third hand to hold a tower shield. She’s Triali’s half-orc half-sister. The Two-Handed Fighter archetype makes two-handed weapons hit even harder. Her Alchemist levels also grant her a mutagen and a few extracts like enlarge person that let her hit harder and control more area.

Chely Temminck

Female Half-Orc Two-Handed Fighter 5/ Alchemist 2
N Medium humanoid (human, orc)
Init +4; Senses darkvision 120 ft., Perception +?


AC 28, touch 15, flat-footed 27 (+10 Armor, +1 DEX, +6 shield, +1 deflection)
hp 68 (7d10+28)
Fort +14, Ref +10, Will +7


Speed 20 ft.
MW adamantine Lucerne Hammer +13/+6, 1d12+14 (x2) B or P
MW halberd +13/+6 1d10+14 (x3) P or S
MW cold iron Orc double axe +11/+4 1d8+13 (x3) S
MW alchemical silver Orc double axe +11/+4 1d8+12 (x3) S
MW composite longbow +9 1d8 (x3)
Special Attacks Overhand Chop, Shattering Strike

Alchemist Formulae prepared:
level 1 (DC 12) expeditious retreat, enlarge person, enlarge person


Str 20, Dex 14, Con 16, Int 12, Wis 10, Cha 7
Base Atk +6; CMB +11; CMD 23 (+6 vs. bull rush & overrun)
Feats Combat Reflexes, Furious Focus, Iron Will, Mobile Bulwark Style, Mobile Fortress, Power Attack, Shield Focus, Weapon Focus (Lucerne Hammer)
Skills 18 ranks: Craft (alchemy) +11 (+5 to create alchemical items), Linguistics +2, Spellcraft +6
Languages Common, Orc, Elven, Giant
Special qualities

  • Missile Shield: once per round, when a ranged attack would hit you, deflect it harmlessly.
  • Vestigial Arm: a third arm to hold the tower shield
  • Overhand Chop: add 2*STR instead of 1.5*STR when making a single attack
  • Mutagen: 20 min duration. +4 to STR, DEX, or CON and -2 to INT, WIS, or CHA, respectively
  • Shattering Strike: +1 to CMD and CMB on sunder attempts. +1 damage against objects
  • Combat Reflexes: 3 AOOs per round
  • Sacred Tattoo: +1 luck bonus on all saves
  • Fate’s Favored: increase all luck bonuses by 1
  • Reactionary: +2 initiative
  • Weapon Training: +1 attack and damage for two-handed polearms
  • Dragon Sight: darkvision 120 ft.

Traits: fate’s favored, reactionary
Gear: +1 full plate, +1 tower shield, +1 ring of protection, +2 cloak of resistance, +2 belt of strength, MW adamantine Lucerne hammer, MW halberd, MW cold iron/alchemical silver orc double axe, MW composite longbow, 20x arrows, 20x blunt arrows, cracked pale green prism ioun stone, ioun torch, 50 ft. rope, fighter’s kit.

Triali the three-armed scarecrow

Triali’s not an effective character. Her average damage is the same as a fighter with a greatsword and Power Attack + Furious Focus. That fighter also gets heavy armor and seven feats to spare. Worse, Triali’s damage is unpredictable, as seen in the graph above. Maybe those huge damage spikes instantly kill a boss, or maybe they overkill some mooks. That’s what makes her exciting to play: watching the pieces interact and waiting for the explosion!

The center of the build is Butterfly’s Sting, which allows a character to pass a critical hit to an ally. So one character uses a weapon (or dual-wields two weapons) with a high crit range to maximize the number of critical hits, and another uses a two-handed weapon with a high critical multiplier to maximize the damage of those critical hits.

Here’s the twist: a character counts as her own ally, and with a third arm from an Alchemist discovery she can wield both a kukri (which crits often) and a scythe (which crits really hard).
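
A quick back-of-the-envelope sketch of why this pairing pays off. The weapon stats match the block below; the assumption that every critical threat confirms is mine, so treat the numbers as illustrative rather than exact.

```python
# Rough expected-damage sketch for the kukri + scythe crit-passing trick.
# Assumes every critical threat confirms, which overstates things a bit.

def avg_roll(dice, sides, bonus):
    """Average result of `dice`d`sides` + bonus."""
    return dice * (sides + 1) / 2 + bonus

kukri_avg  = avg_roll(1, 4, 3)   # +1 kukri:  1d4+3 -> 5.5
scythe_avg = avg_roll(2, 4, 8)   # +1 scythe: 2d4+8 -> 13.0

# The kukri threatens on 18-20, i.e. 3 rolls in 20.
threat_chance = 3 / 20

# Butterfly's Sting hands the crit to the scythe, whose x4 multiplier
# adds three extra helpings of scythe damage on top of a normal hit.
expected_bonus_per_kukri_swing = threat_chance * 3 * scythe_avg
```

Under those assumptions, each kukri swing is worth nearly six extra points of expected scythe damage, on top of the kukri’s own 5.5, which is why the spikes are so dramatic when they land.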

She’s a Ranger so she can focus on Strength and get Two-Weapon Fighting feats through her combat style without meeting the Dexterity requirements. That also means she gets an animal companion. Since she carries a scythe, I gave her a big raven and a scarecrow aesthetic. She does the samurai thing of shrugging off one shoulder of her jacket when preparing to fight, not just to give her left arm freedom of movement, but also to reveal her other left arm!

In the future she’ll get Power Attack, to make her hits bigger and less predictable.  She’ll enchant the kukri with keen, so it crits more often, and add thundering and shocking burst to the scythe, so it will literally crit like a thunderbolt.


Female Human Ranger 5/ Alchemist 2
N Medium humanoid (human)
Init +2; Senses Perception +11

AC 21, touch 13, flat-footed 19 (+7 Armor, +2 DEX, +1 deflection, +1 natural)
hp 56 (10 + 6d8+20)
Fort +11, Ref +11, Will +6 ( +2 Will vs. divine spells)

Speed 30 ft.
Melee +1 kukri +11 1d4+3 (18-20/x2) and +1 scythe +11/+6 2d4+8 (x4)
Ranged MW composite longbow +8 1d8 (x3)
Special Attacks favored enemy (humanoid(human) +4, undead +2)
Alchemist Formulae prepared:
level 1 (DC 12) shield, long arm, enlarge person
Ranger Spells prepared:
level 1 (DC 12) longstrider

Str 20, Dex 14, Con 14, Int 13, Wis 12, Cha 7
Base Atk +6; CMB +11; CMD 23
Feats Butterfly’s Sting, Combat Expertise, Endurance, Iron Will, Two-Weapon Fighting, Weapon Focus (kukri), Weapon Focus (scythe)
Skills 45 ranks: Perception +11, Stealth +11, Swim +9, Climb +9, Knowledge (nature) +11, Knowledge (geography) +11, Handle Animal +8, Perform (pipe) +4
Languages Common
SQ Vestigial Arm, Hunter’s Bond: Raven (Daszo)
Traits: anatomist, disdainful defender
Gear: +1 Mithral Breastplate, Cloak of Resistance +2, Ring of Protection +1, Amulet of Natural Armor +1, Belt of STR +2, +1 kukri, +1 scythe, MW Composite Longbow, 20x arrows, 20x blunt arrows, ranger’s kit