SillyCTF 2025 After-Action Report
Introduction
Yesterday, five months of planning culminated in SillyCTF 2025: the first capture-the-flag (CTF) competition hosted by Penn State’s Competitive Cyber Security Organization. In this post, I will give a retrospective on my experience as the competition organizer, discuss our development process, and share statistics and feedback from the event.
The Plan
This was our club’s first time hosting a competition for other universities and the general public. Thankfully, we had several great CTFs to use as examples, including PicoCTF, RITSEC CTF, and several CTFs hosted by the Cyberforce Program!
When we started planning, our first objective was to pick a theme. We settled on “SillyCTF” because it was general enough to allow creative freedom while still giving a framework for challenge authors to follow.
Another choice we made early in the design process was to separate challenges based on sub-themes. Rather than using the traditional CTF categories like cryptography, reverse engineering, and forensics, we picked concepts and memes from pop culture, including Minecraft, Ice Spice, and Costco.
We began developing challenges in November: our goal here was to create a range of problems with different difficulties, targeting both beginners and experienced players.
For scoring, I chose universal logarithmic decay (see “Dynamic B” here). In this scoring type, all challenges start at the same maximum value. Each time a team solves a challenge, its value decreases for all teams, so challenges that many teams solve end up worth fewer points than ones only a few teams crack. This means teams don’t gain an unfair advantage just by solving easy challenges quickly.
To also incentivize solving difficult challenges, I added first-solve bounties to challenges that remained unsolved after four hours, and later released additional hints for challenges that were still unsolved near the end of the competition.
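The decay idea above can be sketched in a few lines. This is a hypothetical formulation, not the exact formula SillyCTF used; the `max_value`, `min_value`, and `decay` parameters are illustrative, and the linked “Dynamic B” scheme may differ in its details:

```python
import math

def challenge_value(solves: int,
                    max_value: int = 500,
                    min_value: int = 100,
                    decay: int = 50) -> int:
    """Illustrative logarithmic-decay scoring (parameters are made up).

    Every challenge starts at max_value. Each additional solve lowers its
    value for ALL teams, bottoming out at min_value once `decay` teams
    have solved it.
    """
    if solves <= 1:
        return max_value
    # log base `decay`: reaches 1 (i.e., full decay) at `solves == decay`
    value = max_value - (max_value - min_value) * math.log(solves, decay)
    return max(min_value, round(value))
```

With these parameters, a challenge with 2 solves is still worth around 430 points, while one with 50 or more solves has decayed to the 100-point floor — easy challenges naturally become cheap, regardless of how fast anyone solved them.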
The Sponsors
The following companies helped make SillyCTF a success by providing our prize pool and funding for infrastructure. The total value of all the prizes was over $6000; thank you to these organizations!
The Competition
Players & Teams
- 313 players, 212 teams (1-4 players per team)
- Academic bracket: 78 teams
- Open bracket: 134 teams
- 60 teams solved at least one challenge
- 36 players were from Penn State
- Academic teams came from universities in the United States, Belarus, Canada, France, Greece, India, Indonesia, Malaysia, Nigeria, Pakistan, Vietnam, and Japan
Challenges
- 7 developers
- 38 challenges (19 easy, 9 medium, 5 hard, 5 insane)
- 9 categories (Costco, Minions, Ice Spice, Gallimaufries, Waffle House, Culture, The Serious Zone, Scratch, and Minecraft)
- 7/38 challenges were unsolved (TRAPPIST-1e, Grue, Purgatorio, Lax Logging, Cardboard, Deice the Spice, and Costco Membership)
- 6/38 challenges had only 1 solve (“:)”, LSD Simulator, Boom Meter, On The House, mc_OSINT_2.1, and ScratchSpy)
Timeframe
- 12 hours, 8 AM - 8 PM ET (Saturday, March 29, 2025)
- 2/3 of challenges released immediately
- 1/3 of challenges released at noon
- Feedback survey released at 2 PM
- Scoreboard frozen at 7 PM
Statistics
The Feedback
I incentivized feedback by adding a feedback form challenge, which awarded competitors 200 points for sharing their opinions about the event.
58 competitors filled out the survey. They gave an “overall experience” score and rated their own CTF experience, and then rated SillyCTF in terms of difficulty, silliness, and fun.
Takeaways
What went well
- Feedback was positive: people liked the challenges
- The infrastructure stayed online
- The feedback survey was a good way to get insights
- Score decay meant that challenge points scaled properly
- Ticketing was effective for managing support
Room for growth
- Technology changes:
  - Automatically announce solves in Discord with a bot like this one
  - Automatically update CTFTime with a /scores integration
  - Let competitors have their own isolated challenge instances
    - Use Compsole or a similar platform for this
  - Prompt for ticket topics up front, and record ticket transcripts with this bot
- General:
  - Consider running the competition for more than 12 hours
  - Clearly define support hours over multiple days (e.g., 9 AM to 5 PM)
  - Better flag validation with regular expressions
  - More challenges
  - More silliness
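The flag-validation improvement mentioned above could look something like the sketch below. The flag format, character set, and function names here are all assumptions for illustration — the actual SillyCTF flag format isn't specified in this post:

```python
import re

# Hypothetical flag format: sillyCTF{...} with a restricted character set.
# Tolerate surrounding whitespace, but reject anything else malformed.
FLAG_PATTERN = re.compile(r"^\s*sillyCTF\{[A-Za-z0-9_!?-]+\}\s*$")

def is_valid_submission(submission: str, expected: str) -> bool:
    """Reject malformed submissions before comparing against the answer."""
    if not FLAG_PATTERN.fullmatch(submission):
        return False
    # Strip stray whitespace (e.g., a trailing newline from copy-paste)
    # so near-miss submissions of the correct flag still count.
    return submission.strip() == expected
```

Validating the shape of a submission separately from comparing it to the answer makes it easy to give competitors a "check your flag format" error instead of a silent "incorrect".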
Overall, this was a great experience. I really enjoyed seeing everyone’s work come together to make an international event go smoothly, and I’m looking forward to next year!