Tegami

Last edited: 18 May 2020, 1:43PM

Info

This was a summer osu! storyboard made by me and my boi Royce, based on Nekomata Master's song Tegami. The central effect was made using rotoscope animation, then accompanied by various dot effects.

Background

To bring myself out of the collapse of Crusaders, I decided to switch gears back to more normal storyboarding. This was after Royce said he was interested in doing another project with me, with a few key changes. This time, he was interested in coding alongside me. Sometime during the year, I honestly have no idea how or why, he had suddenly decided to pick up programming as a hobby.

I had been debating whether to work with Royce again. I've always wanted to go for rank, and with Royce being on the laxer side of existence, that was probably never going to happen. But getting the chance to work with him more closely on the code changed that equation. Now, after the fact, I'm pretty glad to have done what became a personally reinvigorating and remotivating project.

Crusaders left me in a pretty sour spot. I think through the thick of it all, I was just, hmm, lonely might be the best word. I had no one to show the project to or talk about it with. It was a difficult concept that required a lot of my energy to learn and fix. Eventually, without making much progress, laziness and pressure bottled up until I was sick of the project and wanted to move on. I'm glad I did. I don't know what state I'd be in if I had continued working on it.

Working with Royce was a breath of fresh air, and I enjoyed his company. I value his opinion, personality, and character. He's not the most feverish of workers, though I mean, I think it does take a certain amount of cuckoo to match the sacrifice I give for my personal projects. But for someone with less than a year of C experience, suddenly jumping into C++ and using libraries for such a niche thing as osu! storyboarding, I think Royce did a consistent, great job, and I'm proud of and thankful for his contributions.

Okay, let's jump into how this project came about. The first thing, as always, was song selection. Really, the only criterion Royce had was something loosely, yet concretely, along the lines of "Hey man, let's do Nekomata Master." I said, "Okay."

Nekomata Master, I'm not too sure how I feel about him, even now. I'm not very into folksy songs, and Nekomata has quite a lot of that. Later, as a kind of fun side note, we found out he generally composes under the "World" genre, which, as I joked, was certainly superior to Country. I don't really get it, but World songs are supposed to contain a mix of instruments and sounds heard around the World.

To be honest, I can probably name all the Nekomata songs I like, and I think I'm just going to do that below. Most of the songs I've found came through some form of osu! exposure. I don't own any of his albums, though I kinda want to get some now after working on this project. As an exception, I think Tegami is very good, one of my favorites in fact. I'm still not too tired of Tegami somehow, though maybe somewhat jaded. Through osmosis, I guess I'm obligated to like Nekomata because Tegami is just so good.

I prefer the original over kors k's remix, but the latter has a cool storyboard by Damnae

Camellia's remix is great too, but I think the original feels more "real"

Not gonna lie, I just found this in YouTube recommendations

Royce wasn't totally unbiased and had his preferences for songs. His main choice was Indigo Bird, though he offered Tegami as an alternative and said any other Nekomata song was probably fine. I wasn't a huge fan of Indigo Bird, but I think it's been growing on me. I feel like the song's just a bit too folksy for me and lacks some kind of diggin it factor, I don't know.

If we chose Indigo Bird, I feel like our storyboard would have been quite a bit simpler to put together. There's a very concrete theme built right into the title, bird, as well as a color, indigo, to play off of. I feel like with just those two things alone, we could fill an entire storyboard with enough varied content.

Part of my song selection process involved planning as much as I could for it, so I tried to think of various effects to put in. For Indigo Bird, I think flocks of birds in various shapes and patterns would have looked pretty nice. I also wanted to put some of my art skills to use and maybe draw some cool looking birds. For the ending, I wanted to do something more complicated like a flapping 3D bird.

I spent like an hour figuring out how to download this from reddit with ffmpeg

Sonically, Tegami is definitely my preference and favorite over Indigo Bird. I just really like the song. There's also a bit of history with Tegami between Royce and me. A year or two ago, when plug.dj had its stint in our osu! UCI club, Royce and I would occasionally be in a lobby together and share music with each other. Every now and then he would play his Nekomata playlist, and two particular songs stuck out to me: ワタリ草想 and Tegami.

I'm going to be honest, I have no idea what ワタリ草想 romanizes or translates to. I just refer to it as the Purple Flower 6 Minute Song. After he shared them with me, I listened to Purple Flower 6 Minute Song and Tegami over and over for the next couple of days and probably weeks. My preference eventually shifted to the latter, as it was shorter, had more going on, and, I don't know, felt more powerful and more resonant with my soul, I suppose.

So when Royce brought up Tegami as an option, I remembered how good it was, and I was like, damn, I really want to make this work. But making it work was kind of difficult. "Letter," as Tegami translates to, was not something very friendly to work with. Envelopes weren't really interesting by themselves. We could maybe do something word based and make the storyboard very lyrics focused.

One cool effect I had in mind that never made it into the storyboard was a sea of words. Characters would splatter the screen, and through color selection, we'd highlight certain characters that reflected what was being said in the song. I think this was slightly inspired by some scenes (see trailer below) from Fune wo Amu, a show that featured a lot of these seas of words. As for the show itself, uh, I think it's kind of average. It felt very sincere and the concept was kind of interesting, but I don't know, it was just okay for me. 6.669 out of 10. Hey, at least I finished it.

Well, okay, obviously the song is more than just the title, so what about the lyrics, what can you draw from that? Okay, here's the thing with Nekomata. I'm pretty sure all his vocal songs, well you know what, let's just say all his songs, are generally kind of sad and depressing and reminiscing and regretful and stuff, you know what I'm saying. That's the conclusion that Royce and I came to, and that's also why we like his songs.

Royce wrote his own translations for the songs mostly for his enjoyment, I think, but also so that I could get a better understanding of what was going on. There were rougher translations made by a Nekomata fan online, but with Royce, I could ask him personally what his interpretations were. Spoiler though, I still don't really know what's going on.

Indigo Bird... hmm. After reading Royce's translations a few times, I'm kind of convinced that it's about longing and regretting a lost connection so much, that the speaker is considering suicide. Or something like that. Maybe that's a little extreme, and I think Royce thinks so too, but that was my general gist of it.

Tegami was also pretty abstract. I'm convinced somebody has died in here, specifically whoever the speaker, hmm I see a trend, is longing and reminiscing about in the song. I think it's about remembering a past special someone and wanting to say some final words to him, probably, but she can't, so she's sending a letter to the beyond instead.

So we kind of have a problem. I don't think it's easy to draw effects based off of sadness and things that aren't very concrete, or at the very least, I had difficulty with this. I wanted to make Tegami work though, so I started thinking about more general effects we could apply rather than work from just the song alone.

I was very close to just calling it quits and doing Indigo Bird, but then I thought of a storyboard effect I've always wanted to do: animation. This came about because I wanted some way to show writing, like physically writing a letter or message. Funny how this never actually made it into the storyboard though. But extending the thought from there, I started thinking of general animated scenes to take up the full length of the song.

I didn't know what exactly I'd fill all the scenes with just yet, but I was confident I could make it work. And with that, I optimistically declared that we'd work on Tegami over Indigo Bird, with the animation as the centerpiece. Royce was kind of hesitant at first, I'd say, and I think he was unsure what I was even imagining in the first place (that happens a lot), but I'm sure I won him over after showing him early versions of my rotoscope utility and seeing some sample animation come to life.

I don't know if this is rotoscope, but this guy is cool

Great song too

I suppose it's a good time to talk about rotoscoping now. If you don't know what it is, it's a process of creating animation by drawing over real-life video. Going frame by frame, you draw and color each frame of the video, and the goal is that by the end, you can remove the video completely and just string together all the frames you've drawn into (hopefully) fluid animation.

Rotoscoping has a very particular, somewhat strange look associated with it. It feels very different from traditional Japanese anime, American animation, and CG. Many critics comment on how ugly yet realistic it looks, and on the beauty in that, since it draws so directly from real life. Of course, you can add in your own special effects, but since it's so closely based on real life, there are always lingering details that make it eerie in some ways.

If you're curious, I highly recommend watching Aku no Hana, a rotoscoped and controversial anime that I decided to finally pick up about midway through this project. Damn did it blow my mind. It's one of my top favorites right now, and wow, I don't know, everything in it seemed to work for me. I connected well to the main character, enjoyed the slow, ambient pace of the show, and actually really loved the quirky animation and sound.

I was going to show a calm scene from the show that didn't spoil anything, but I can't find one at least readily available on YouTube. So instead, enjoy some of the dynamic music selections used throughout the show.

OP

ED

OST

The reason I chose to do rotoscoping wasn't its eeriness, but more skill and time constraints. I have a little bit of an art background, but with little experience in animation or animation tools, I figured it would take way too much time for me to draw 4 minutes of animation from scratch.

Let's do some simple math to see how intensive drawing this animation is. Tegami was drawn at approximately 5 frames per second. So 5 frames per second for a little over 4 minutes of footage totals to something like 1250 frames. If each frame takes 5 minutes to draw, that's over 100 hours of work, on drawing alone.

You can't escape those 100 hours. Most of my summer nights were slaved away doing that, but at least with rotoscope, I had a video basis to work off of and didn't have to rely on my nonexistent animation knowledge to carry my weight. Of course, there were other time sinks as well, such as making and updating a rotoscope utility, planning and recording the video, editing the final video frames, etc.

I love surreal memes and I hope you do too

It's actually kind of interesting thinking back on this now. I guess I didn't need to make a rotoscope utility. I'm sure there are things out there that exist that do this kind of thing already. But I suppose for experience and for finer control, making it from scratch seemed like an interesting project, so that's what I did.

With all this in mind, I committed to rotoscoping the entire song duration. Sure, I could have cut some corners, maybe not draw for the entirety, but I feel like if the animation was going to be the central thing, it's gotta be the central thing for the whole thing. Since this would likely be the most time intensive and complicated section, I personally handled and oversaw all the animation coding, drawing, production, etc. Royce, on the other hand, who was very fresh to C++, would focus on simpler accompanying effects to add in the background and foreground.

And then we got to work. Royce spent his initial weeks learning about C++ and getting used to my storyboard library. My initial goal, before we started working, was to create a complete outline detailing all our desired scenes and effects, so we had a rough guideline to go by. Here, I focused on storyboarding (yes, actually storyboarding) the video that we'd planned to use. Meanwhile, Royce would come up with a variety of effects to place in the background.

The original concept for the animation sprouted from the idea of animating someone writing a letter. As mentioned, we never incorporated this into the actual animation, though it did make an appearance in the background image of the map. To make my life easier, I wanted to try and keep the video as simple as possible. Use as few props and people as possible. No background drawing at all. All backgrounds would ideally be handled by Royce. I just needed to focus on drawing the centerpiece.

This led to animating a single girl, representing the singer, moving and interacting on camera. The only problem is that I don't actually know too many girls who would be interested in modeling/acting for this kind of thing. Thankfully, Shannon, who hangs out around the osu! club often, exchanged some words with me and was gracious enough to help us out.

For the next few days, I broke out the Photoshop and drew a bunch of rough storyboards for scenes. I've reproduced them below for reference. They're pretty basic, but it was pretty fun and interesting working on them. It's always been a kind of far off dream of mine to work on an anime/animation project, and I suppose I can cross some of that off with this project.

1. Waking Up

2. Walking

3. Running

4. Closeup

5. Lips

6. Stand Up

7. Turn

Let's dive into some of the storyboard details here. I listened to Tegami many times over, and the first thing I wanted to do was match movement to the pace and intensity of the song. On slower sections, the model would walk slower, and on faster paced sections, she would pick up speed. To better show motion, it was only natural to show things in profile view.

The first scene, waking up, was actually inspired by Shannon, who is well known to take many generous naps. In her own words, paraphrasing, "Um, I don't know how much I can help, but I'm pretty good at sleeping." So we put that in. I think it's a good introduction to the song. Waking up, I don't know, kind of symbolizes the viewer's own introduction to the song as well.

I think the camera rotation in the beginning was something small, but pretty smart. Instead of having the model lie down and pick herself up to signify waking up, we just had Shannon stand still from the start and rotated the camera to swing her into view. Less logistics and hassle involved. Then from there, we zoomed out to incorporate her whole body for walking and running.

The walking section is purposely bland, so that I could leave room for Royce to add in his effects and share some of the limelight. Well, not going to lie, I also couldn't think of enough activity to fill out the entire song, so you have these pretty boring animation sections. I did try to add some ideas for micro movements to make things a little more fancy, something like skipping around, hand placement, expression, etc., but it's not enough.

To match the lyrics and take advantage of rotoscoping, I wanted a closeup to the lips singing the words. This would look best from the front, so I needed a way to transition from profile to front facing view. Fortunately, there was a long, calm section in the middle where I had the luxury of time to spin around and zoom the camera closer.

The section after the lips was a dramatic, lengthy, get-up-off-the-ground-and-move-closer-to-the-camera action. Not too much is going on here, like the walking scene from earlier. To be honest, I think it's kind of weak for the climax of the song, but Royce's tree effects were able to carry some of the weight and spice things up a notch.

The ending of the storyboard was actually based on the last few lines of the song. In romaji, "Mitsumete, anata to aruita kono michi," and translated, "Look this way, the way we walked down this path." Naturally, I felt this lent itself to one of those kind of cliche, walking out towards the horizon, on a path, kind of scenes.

Somehow, the idea of Okaerinasai snuck itself in there. I still listen to the song from time to time, so I probably saw the familiar artwork (see below) attributed to it and connected it back to the storyboard. I don't know much about the artist or any history of the image. The first place I saw it was from Azer's now old Okaerinasai collab.

Source

I wasn't sure how doable it would be to fade in small pieces of the image into the background, but I had the idea at the time to fill the background in like a mosaic and eventually finish at the pivotal looking-back moment. The actual image and the end result don't match up perfectly. To be honest, I could have done some more editing to make the two match better, but I think it's alright as it is now.

I'm kind of conflicted about this Okaerinasai section. Judging by the general reaction, I think most people thought this ending was probably the best section, and, most of them being players, appreciated the reference to Okaerinasai. Personally though, I thought this section was kind of cheesy and didn't fit the rest of the song well, so I was somewhat surprised this portion got the bulk of the attention.

Kenny, after reviewing the storyboard, also remarked how the girl was kind of just getting smaller, not really walking towards the distance, and I kind of see where he's coming from. But I suppose nostalgia and historic value win out, and it was a fine scene to end on. Looking at it now, it's a bit ominous too, I suppose, with 11t, the storyboarder for Okaerinasai, having pretty recently passed away, and with coping with a loved one's death being my interpretation of this song.

Okaerinasai was one of the first songs that really got me into osu! and eventually to storyboarding, so I have a lot to thank for that. It's a song I love, and as a whole, I think it's an amazing map and an amazing idea of a map. Here you have a remix, of a song, made by 4 different mappers and a storyboarder, put into osu!. Just thinking of all the collaborative hands that touched on this single song and project... it kind of warms my heart, if you know what I mean.

I could never DT this

11t, hmm, is someone I don't know too much about. Besides a few storyboards he's worked on, I only know that he passed away recently. I never got a chance to talk with him personally, and I don't think he really interacted that actively in the osu! community outside of mapping and storyboarding. Well, Haoming, an actual god gracing this mortal Earth, said that his EOS storyboard is the best of all time, and I still never really figured out how he did the Okaerinasai arrows, so Mr. 11t, you get an A+ in my book.

And that was pretty much it in terms of storyboarding the storyboard. Of course, scenes shifted around and changed over the course of the project, as with all things in pre-production, but the general idea remained. Any problems, as would commonly be said between Royce and me, would be fixed in post-production.

There was going to be some time before we could get a chance to record the video, however. Right then, it was a week or two before Anime Expo, and our group of club friends were all volunteering at the osu! booth. The planning and work required for that was pretty exhaustive, so we decided to delay filming until some weekend after the event.

In the meantime, I would start writing a few utilities for the project. There was the rotoscope program to make, of course, but before that, I wanted to make another utility, for Royce. After reviewing the video storyboard and having some time to think of effects, one of the effects I wanted was a way to draw lyrics, but in a particular way: "stroking" characters step by step as if they were written by hand.

To make these lyrics, I made a utility to record strokes, using VB.NET and WinForms. I chose these two because they're what I use at work, and I thought it was a good opportunity to get more exposure to them. How the utility works is pretty simple. You draw strokes on a fixed-size canvas, and all the point and line information is saved into a custom file format. Order is preserved, so you know which stroke comes before or after another.

It's been some time now, so I've forgotten the various details in its coding I had to deal with. This was my first time working with the Tree control, so I suppose it was interesting getting a bit used to that. You can do some pretty interesting things with the Tree in the utility, dragging nodes up and down to rearrange strokes for instance. For the most part, everything here was smooth sailing and pretty comfortable to work with.

On the storyboard code side, you need to read these custom files back in, following the special file format to get all the line info. Then you can use this information to finally draw out your characters. I handled much of this parsing since Royce wasn't very well versed in C++ and its libraries. He would work on the drawing code later on.
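
Since the format itself isn't documented here, take this as a minimal sketch of what the C++ reading side can look like, assuming a hypothetical layout where each line of the file holds one stroke as ordered coordinate pairs; the real utility's format and the storyboard library's types are their own.

```cpp
// Minimal sketch of reading a stroke file on the C++ side. The real format is
// the utility's own custom one; this assumes a hypothetical layout where each
// line holds one stroke as "x0 y0 x1 y1 ..." pairs, in drawing order.
#include <fstream>
#include <sstream>
#include <string>
#include <vector>

struct Point { float x, y; };
using Stroke = std::vector<Point>;

std::vector<Stroke> LoadStrokes(const std::string& path) {
    std::vector<Stroke> strokes;
    std::ifstream file(path);
    std::string line;
    while (std::getline(file, line)) {      // one stroke per line
        std::istringstream ss(line);
        Stroke stroke;
        Point p;
        while (ss >> p.x >> p.y) {          // read coordinate pairs in order
            stroke.push_back(p);
        }
        if (!stroke.empty()) {
            strokes.push_back(stroke);      // file order preserves stroke order
        }
    }
    return strokes;
}
```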

Royce was living with me during this time, for a little less than a month. It was a good opportunity to unload everything I knew about programming and storyboarding onto him and help him get a further head start in coding. I don't know if I'm the best teacher of these sorts of things though. Royce, among others, and maybe even you as well, have remarked that I tend to be too verbose and long winded with my explanations. Information, I imagine, gets lost as I try to convey it too thoroughly.

I don't know. I think I try my best, but I'm sure Royce was still lost on some concepts I tried to teach him. Classes and data structures, in particular, I know he struggled with. I have my own failings, beyond just teaching skill. I'm also lacking in concrete knowledge. I don't know the nitty gritty of all that goes on, and I often can't answer with 100% assurance. I'm working on that in my free time, reading more programming books and trying to learn more in general.

Anyways, back to the utility, in the first iteration, I only planned for the lyrics to be made up of straight lines, not, as you'll notice, the dot curves that the final video has. At the time, I thought it would be fine if the lines were just straight, jagged strokes, but Royce was pretty reluctant to leave it at that.

Much later in the project, when Royce finally transitioned back to working on the lyrics, he reiterated that he wanted to update the strokes so that curves were supported. Fortunately, I had some already working spare code from Crusaders handling Bezier curves, so we ported that to our storyboard and utility code. This process was relatively smooth, save for one or two minor bugs, and after some fiddlin' around, it was all good.

I say fiddlin', because neither Royce nor I know that much about Bezier curve mathematics outside of some very introductory and simple concepts gathered from the first few chapters of this online guide. To do slightly more complicated things, like measuring how many dots to draw per stroke and at what distance from each other, we made rough estimates using discrete steps along the Bezier curve.
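
For the curious, here's roughly what that discrete-step estimate looks like, a sketch rather than our actual code: sample the cubic Bezier at small, even t increments, add up the segment lengths, and divide by the desired spacing to get a dot count.

```cpp
// Sketch of the "discrete steps" estimate: walk a cubic Bezier in small t
// increments, accumulate the segment lengths, and derive how many dots fit
// at a given spacing. Names here are illustrative, not the project's code.
#include <cmath>

struct Vec2 { double x, y; };

// Evaluate a cubic Bezier at parameter t in [0, 1].
Vec2 CubicBezier(Vec2 p0, Vec2 p1, Vec2 p2, Vec2 p3, double t) {
    double u = 1.0 - t;
    double b0 = u * u * u, b1 = 3 * u * u * t, b2 = 3 * u * t * t, b3 = t * t * t;
    return { b0 * p0.x + b1 * p1.x + b2 * p2.x + b3 * p3.x,
             b0 * p0.y + b1 * p1.y + b2 * p2.y + b3 * p3.y };
}

// Approximate the arc length by summing straight segments between samples.
double ApproxLength(Vec2 p0, Vec2 p1, Vec2 p2, Vec2 p3, int steps = 100) {
    double length = 0.0;
    Vec2 prev = p0;
    for (int i = 1; i <= steps; ++i) {
        Vec2 cur = CubicBezier(p0, p1, p2, p3, static_cast<double>(i) / steps);
        length += std::hypot(cur.x - prev.x, cur.y - prev.y);
        prev = cur;
    }
    return length;
}

// Rough dot count for a stroke, given a desired spacing between dots.
int DotCount(double strokeLength, double spacing) {
    return static_cast<int>(strokeLength / spacing) + 1;
}
```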

Per his needs, I updated the utility to support dot curve drawing. I added a few checkboxes and sliders so that you could have more control over displaying all the anchors/dots. I added a selection tool to allow you to select lines and anchors on the grid. Below is a screenshot of the final utility, with a character stroked out.

Royce tested it out, and I think he was impressed, for the most part. Well, we did immediately find some bugs as he tried doing some character work, and I cranked through those. Most of these issues were pretty small things. I think the most difficult bug to crack down on was something dealing with perfectly horizontal or vertical lines. There are a few problems remaining to this day that I couldn't reproduce or fix, but Royce said there were simple workarounds for them.

After the first iteration of the stroke utility, I worked on the rotoscope tool. This tool didn't need to be too complicated. I just needed some way to play a video frame by frame, and over each video frame, manually draw lines and color in space, that's it. My first attempt at this was using WinForms, and unfortunately, I quickly found myself at some dead ends.

First is the problem of playing video. There isn't a completely straightforward way to do this. Looking online, it seemed the easiest, recommended way was to jam in a Windows Media Player plugin and use that to play video. Using WMP is pretty janky, not gonna lie, but it worked and I suppose it was bearable. The problems mostly came after that, however.

So for some reason, I could not figure out how to overlay lines and color blocks on top of the media player. I couldn't draw this on the main panel, else it would go behind the media player. I also couldn't make a panel on top of the media player and draw on that, otherwise the whole solid panel would just cover the media player. Essentially, I was kind of stuck because I didn't know how to control opacity correctly in WinForms.

After searching around a bit, it looked like handling opacity was not entirely trivial, at least where WinForms was concerned. In its stead, people recommended using the richer Windows Presentation Foundation. So after a day or two of WinForms failure, I switched to WPF. WPF... definitely had its learning curve that I struggled with. I'm not so sure why. It's been a while since I initially started using it, but I remember a lot of small things adding up to an uneasy experience. Maybe it was just the fact that I had to learn a lot of incremental things as I built one small feature at a time.

I think one of the things that really bugged me was that Visual Studio itself had some bugs integrating with WPF. How I would have liked to work was with the Designer view on one screen and the rest of my program files on another, but this led to some funky problems where my Properties window started bugging out. It was kind of weird, like the window just refreshed itself whenever I clicked it, and I couldn't edit any fields. Putting everything on one screen seemed to help a bit.

Another thing that bothered me was doing what I thought were simple things like, I don't know, adding hotkeys. To do this, I did some pretty roundabout CommandBinding thing that involved editing a few places and making a new class. I guess most of my problems here were just issues getting familiar with the framework. I probably should have spent more time learning the basics of WPF rather than trying to hack through each of my issues one by one.

(it's an order)

I don't know. I read a few introductory WPF tutorials online, but I think it's pretty boring overall to learn by studying. Me, I guess as a person, I really just want to sink my hands deep into an unknown project and learn as I go. And well, more often than not, a Frankenstein of code is born as a result. I definitely felt some of that fumbling around in this rotoscope utility, but I'll take my experiences as I go and hopefully build myself up better in the future.

My first task was to display video. This, thankfully, was handled very gracefully by the built in WPF MediaElement. MediaElement, and pretty much everything WPF, comes conveniently with opacity controls to handle fading. Great, just what I needed. Next came video playback. I set up a progress bar underneath and worked on playing, pausing, and skipping video. The progress bar updates itself automatically through a timer. Each timer tick checks the progress of the video and updates the progress bar accordingly.

I set up the frames next. The start, end, and offset frame timings were all hardcoded. I was only planning on using the utility for this single video anyways, so I based all the calculations off of our eventually recorded video. Skipping through a video had to be handled carefully, making sure you jumped to the right time and frame.

I think even at the end of the project, there were still some bugs remaining with frame skipping. Sometimes, I would have an issue where lines/colors that appear on one frame magically get moved to another frame. I'm not sure exactly why this happens, though my best guess is that it comes from moving through frames too quickly.

The offset between each frame is somewhere around 190 ms. This value was chosen based off the BPM of the song; I think 190 ms is roughly the length of a quarter note. That works out to about 5.3 frames per second, which fits pretty closely with the 5 frames per second, 1250 frames total estimate I made before, and though daunting, I thought this was doable. Never could I truly imagine, though.

Finally it was time to lay down lines and color. Lines in the rotoscope utility worked differently from the stroke utility. In the stroke utility, you had to separately click to specify each point in a stroke. This was done for precision purposes. Lines in the rotoscope utility, however, can be made more quickly by clicking, dragging, and lifting to draw a complete line. The starting point is recorded where you first press down, and the second point is where you lift up. To draw a line in real time, a timer was used to constantly draw an in-progress line to the mouse position.

You can only draw straight lines in the rotoscope utility because curves would be too difficult to render. osu! storyboarding does not support any notion of lines or curves. It only knows how to draw images and stretch them into shape. You can easily make a straight line by stretching a single-pixel block to some length and width, but you can't easily draw curves the same way. I was a bit worried about the quality of having only straight lines, but after a few tests, I think they looked fine, and with further adjustments, they looked great in the final video.
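
To give an idea of the math, here's a sketch of turning a line segment into the placement, scale, and rotation you'd feed a stretched 1-pixel sprite; the struct and function names are just illustrative, not my library's actual interface.

```cpp
// Sketch of turning a line segment into the scale/rotate values you would
// feed a 1x1 pixel sprite. The LineSpriteParams struct is illustrative; the
// actual commands come from the project's own storyboard library.
#include <cmath>

struct LineSpriteParams {
    double midX, midY;   // where to place the sprite (its center)
    double scaleX;       // stretch along the line's length
    double scaleY;       // line thickness in pixels
    double rotation;     // radians, as osu! storyboard rotate commands expect
};

LineSpriteParams LineToSprite(double x0, double y0, double x1, double y1,
                              double thickness) {
    double dx = x1 - x0, dy = y1 - y0;
    LineSpriteParams p;
    p.midX = (x0 + x1) / 2.0;
    p.midY = (y0 + y1) / 2.0;
    p.scaleX = std::hypot(dx, dy);      // length of the segment
    p.scaleY = thickness;
    p.rotation = std::atan2(dy, dx);    // angle from the x-axis
    return p;
}
```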

Right clicking a line deletes it. If lines overlap each other, the topmost line is removed. A deletion distance threshold needs to be tuned to feel natural. To delete rotated lines, you calculate the projection of the mouse position onto the line, then compare the resulting distance against the deletion threshold. You also need to take care of special cases such as perfectly horizontal and vertical lines.
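
As a sketch of that deletion test (not the exact code), you can project the mouse point onto the segment, clamp it to the segment's ends, and compare the resulting distance against the threshold; written this way, the horizontal and vertical cases fall out of the same formula.

```cpp
// Sketch of the deletion test: project the mouse point onto the line segment,
// clamp to the segment, and compare the resulting distance to a threshold.
#include <algorithm>
#include <cmath>

double DistanceToSegment(double px, double py,
                         double x0, double y0, double x1, double y1) {
    double dx = x1 - x0, dy = y1 - y0;
    double lengthSq = dx * dx + dy * dy;
    if (lengthSq == 0.0) {                  // degenerate: segment is a point
        return std::hypot(px - x0, py - y0);
    }
    // Parameter of the projection, clamped to stay on the segment.
    double t = ((px - x0) * dx + (py - y0) * dy) / lengthSq;
    t = std::clamp(t, 0.0, 1.0);
    double projX = x0 + t * dx, projY = y0 + t * dy;
    return std::hypot(px - projX, py - projY);
}

bool ShouldDelete(double px, double py,
                  double x0, double y0, double x1, double y1,
                  double threshold = 5.0) {  // tuning value, chosen to feel natural
    return DistanceToSegment(px, py, x0, y0, x1, y1) <= threshold;
}
```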

Then I worked on coloring. Coloring, well, works quite a bit differently than you'd imagine. You might be thinking of something along the lines of a paint bucket tool to color in all the inner sections of the lines. But, again, osu! has no notion of painting, only drawing images. So, my alternative "coloring" was to draw giant blocks of white and rotate them into position. To draw a color block, you click and drag to expand a rectangle, then use the mouse wheel to rotate the block into position.

To make color deletion simple, I use the rotation angle associated with a rotated rectangle. Using the angle, any rectangle can be temporarily rotated back into a normally situated, axis-aligned rectangle. In this form, a rectangle/point containment check becomes trivial. The only other thing you have to account for is rotating the mouse position by the same angle around the rectangle's center.
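
A sketch of that check, with an illustrative ColorBlock struct rather than the utility's actual class: rotate the mouse point by the negative of the block's angle around its center, then do a plain axis-aligned bounds test.

```cpp
// Sketch of the color-block deletion check: rotate the mouse position by the
// negative of the rectangle's angle around the rectangle's center, then do a
// plain axis-aligned bounds test.
#include <cmath>

struct ColorBlock {
    double cx, cy;       // center
    double width, height;
    double angle;        // rotation in radians
};

bool Contains(const ColorBlock& block, double px, double py) {
    // Translate the point so the rectangle's center is the origin.
    double dx = px - block.cx;
    double dy = py - block.cy;
    // Rotate by -angle to undo the rectangle's rotation.
    double c = std::cos(-block.angle), s = std::sin(-block.angle);
    double localX = dx * c - dy * s;
    double localY = dx * s + dy * c;
    // Now it's a trivial axis-aligned containment check.
    return std::fabs(localX) <= block.width / 2.0 &&
           std::fabs(localY) <= block.height / 2.0;
}
```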

Saving an animation file uses class serialization and XML. Before you say anything, yes, XML is an absolutely terrible format to store pretty much anything in due to excessive tags. Size... definitely became somewhat of an issue. I think at the end, the full animation XML totaled around 44MB. I chose XML because it's something I use a lot at work and wanted to play around with more.

Regardless, I think XML was bearable, and the experience working with it was fine. On the storyboard C++ side, I used a simple XML library, tinyxml, to help with the parsing. tinyxml was fairly easy to set up and use, but it has its limitations as a simple library. One thing I couldn't avoid was having to load the entire XML file in at once, and that cut quite significantly into generation time.
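
For reference, reading the data back with tinyxml looks something like the sketch below; the element and attribute names are placeholders for whatever the serializer actually wrote, not the real schema.

```cpp
// Sketch of pulling the animation data back out with tinyxml. The element and
// attribute names ("Frame", "Line", "X0", ...) are placeholders; the real
// names come from whatever the .NET serializer wrote out.
#include <vector>
#include "tinyxml.h"

struct Line { double x0, y0, x1, y1; };
struct Frame { std::vector<Line> lines; };

std::vector<Frame> LoadAnimation(const char* path) {
    std::vector<Frame> frames;
    TiXmlDocument doc(path);
    if (!doc.LoadFile()) {
        return frames;                      // failed to parse; return empty
    }
    TiXmlElement* root = doc.RootElement();
    for (TiXmlElement* frameEl = root->FirstChildElement("Frame");
         frameEl; frameEl = frameEl->NextSiblingElement("Frame")) {
        Frame frame;
        for (TiXmlElement* lineEl = frameEl->FirstChildElement("Line");
             lineEl; lineEl = lineEl->NextSiblingElement("Line")) {
            Line line{};
            lineEl->QueryDoubleAttribute("X0", &line.x0);
            lineEl->QueryDoubleAttribute("Y0", &line.y0);
            lineEl->QueryDoubleAttribute("X1", &line.x1);
            lineEl->QueryDoubleAttribute("Y1", &line.y1);
            frame.lines.push_back(line);
        }
        frames.push_back(frame);
    }
    return frames;
}
```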

Automatic (de)serialization provided by .NET was kind of mind boggling, in that it actually worked. Using it wasn't so difficult, but there were a few things I had to tinker with before I got it working perfectly. It turns out you can't serialize many .NET objects, only simpler types. My workaround was to create simpler classes representing lines, rectangles, and any other objects I needed. Then I could just serialize those instead.

I think that's probably it on the coding side. I'm kind of tired of writing this, so I'll skip some details. Alright, I'll give you one more paragraph. I guess I'm pretty proud of myself overall for making and completing this utility. It's just a matter of execution. I had a design. I implemented it. I patched it up. And it's working, great. You know, it's the smaller things in life that keep you going, and this was a pretty cool small thing. When things get finished, I'm happy.

Oh, I guess to tack on, this utility actually got me promoted at work. My manager was thinking about moving me to development and asked if I'd been working on anything recently. I showed him this rotoscope tool. He thought the utility was on the simpler side, but it was sufficient to demonstrate some basic coding abilities, and I got promoted from my quality control position. The worst I probably got nabbed on was getting called a script kiddie, and I'll nod my head to that one.

With the animation tool in decent shape, I made some test animations in it using Royce as a model, and it wasn't that bad, even at 5 FPS. I think seeing those first couple clips come alive really solidified everything for me. Then, I was concretely committed to completing the rotoscope no matter what and ready to see it to the (bitter) end.

Sample of the rotoscope utility and my favorite frame

The rotoscope utility was in okay shape by the time video day came some weekends later. Many fixes would be put in over the course of the next weeks as I started using it, but it was in a usable enough state to start working. Our video crew on film day consisted of Royce, Shannon, myself, and a few of our club friends. I think we were planning on going to Round 1 later that day and conveniently scheduled the shoot for earlier.

We didn't make many preparations for filming. The only things Shannon needed to do were wear a flowy dress for the occasion and briefly review the storyboards. I printed out two extra packets of the storyboards and handed them out for us as reference. Pretty sure some of the pages weren't ordered correctly, but you can blame the printer and my laziness for that one.

Filming was surprisingly easier than I imagined. I was a bit worried since I had never had any recording experience, so I was expecting things to go pretty badly. It helped that our storyboard and scenes were very simple and flexible. I was also very familiar with the storyboards, having made and looked over the material quite thoroughly.

This was the only day we planned on shooting. The point of this video was just to give me a reference to base off of. The idea and meme was that any problems or bad footage that we had would be fixed in post-production, by me, when I drew over the video. What this mostly meant was me having to add a few more features to the utility, namely the ability to reposition all of a frame's lines and colors by some distance to fix camera shaking.


The whole recording session probably took around an hour. The first half was spent sitting down and going through the storyboards panel by panel. Most of us had taken a glance at the storyboards before, but I wanted to clearly express my intent and ideas about each section. Oh, I should probably talk about our location as well.

For our shoot, we just needed an open area where Shannon could walk around freely. There was a large field next to where most of us lived, so we chose that pretty convenient spot. On a Saturday around noon, it was pretty much empty save for maybe one or two other people. There were a few environmental issues that came up, but we dealt with those.

One problem was that the wind was blowing unexpectedly hard. That meant we couldn't shoot a lot of the artificial wind sections of the storyboard that I wanted, but we could cut those out. On top of that, the sun was pretty strong in the sky, so when Shannon had sections lying down, she couldn't really keep her eyes open due to the light. Our somewhat slapdash attempt at fixing this was having someone hold cardboard over her to keep her in the shade, but that didn't really help much. Guess we should have filmed later or earlier.

rip

After sitting and reviewing the whole storyboard, we rehearsed each section individually. It probably took another half hour to run through the entire thing once piece by piece. We had to figure out a few logistics on the spot. Royce played the background music from his phone and also looked through his storyboard packet to tell Shannon the upcoming parts. Brenn helped Royce with some of his paper organization. I would be doing the main filming. By the end, I think we were in okay shape.

Then came our yolo take. The final video that we based everything on was just a single take from start to end. Something like 5 minutes and we were done. It looked... okay, kind of bad. We sat down and reviewed the final footage, and we all laughed because it was definitely very janky in some parts. Just for our embarrassment's sake, I don't think I'm going to post this video here.

Probably the worst sections were where a lot of movement was involved. Particularly, the running scene saw a lot of shaking when I tried to keep up. And the part where I moved down to rotate in front of Shannon was also pretty choppy. I didn't want to have to deal with multiple video files, so I told myself I'd fix all the problems through post-production, somehow.

That concludes filming, and now we get to the drawing, the big meat of the project. This is by far the most tedious, boring, and difficult part of the whole thing. From mid July all the way to the end of September, I spent my free time drawing pretty much every single night. Yeah, I felt like I lost my mind. It takes a certain loose screw I think to actually spend this much time accomplishing something like this.

Well, I feel like I prepared myself mentally enough though. I crunched the hours I needed to work, and that was give or take on the mark. I had a general idea of the time, effort, and cost of the animation work. I don't think I burned myself out too quickly. Each day's work would, on a good day, equate to about 5 seconds of animation on a weekday and 10 seconds on a weekend day. It's slow, in the grand scheme of 270 seconds, but you gotta keep chugging.

It's good to see your work come to life in front of you, even if it's just a few seconds at a time. I think I started getting a better hang of rotoscoping as time went on. Less deletions. Faster stroking. More familiarity using my tablet. Right, yeah, that's another thing, you probably want to be pretty comfortable with a drawing tablet or something before you attempt something like this. Just a suggestion.

You can see the overall style of the project solidify as time goes on. For the first 30 seconds, I experimented with a scratchier look, where I roughly and quickly sketched in all the lines and only cared about their general placement. Later, however, I would be more careful and minimal with line placement, making everything connected and aligned properly.

Why I decided to keep the old style in, hmm, I think I was just too lazy to redraw it. I fixed a few frames, but then was like, damn, do I still want to do this for another 130 frames? Nah, I think I'm good. Royce said it didn't really bother him, and I think internally, I chalked it up to some artsy aesthetic, like, oh, the style is evolving over the course of the animation. It's art, dude.

This is art

What the hell a friend actually just linked me this

Over the course of the rotoscope production, I frequently checked how the animation looked inside osu!, just to see if everything was okay, check if I missed any obvious strokes, and most importantly, pay attention to size. Size was something I was scared about, and as the 30MB limit started to encroach upon the project, it was time to start doing some optimizations.

The first thing I did was make a Sprite Pool to cover all the sprites. To avoid creating new sprites repeatedly, line sprites would be taken from a larger pool. If the pool is too small, additional sprites are added and referenced from then on. The easiest way to save space is to reduce the number of sprite commands as much as possible, so I cleaned up my code with this in mind.

Specifically, fading was a bit of a tricky case. A lot of the time, you don't need to fade a sprite at all, because if you use, say, sprite #50 from the sprite pool in one frame as well as the next one, you can keep its fade constant at 1. You only run into an issue if sprite #51 is used in one frame, but not in the next. Then you need to set the fades of sprite #51 through the last used sprite to 0.
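
A minimal sketch of the pool bookkeeping, with a stand-in Sprite type rather than my library's actual class: hand out sprites in order each frame, grow the pool on demand, and at the end of a frame report which tail of the pool went unused so only those sprites get a fade-to-0 command.

```cpp
// Minimal sketch of the sprite pool bookkeeping. "Sprite" is a stand-in type;
// in the real project it would be whatever the storyboard library hands back.
#include <cstddef>
#include <utility>
#include <vector>

struct Sprite { /* storyboard sprite handle */ };

class SpritePool {
public:
    // Hand out the next sprite for the current frame, growing the pool if needed.
    Sprite& Get() {
        if (used_ == sprites_.size()) {
            sprites_.emplace_back();  // pool too small: add one more sprite
        }
        return sprites_[used_++];
    }

    // Call at the end of each frame. Returns the half-open index range
    // [first, last) of sprites used last frame but not this one (empty when
    // this frame used at least as many); only those need a fade-to-0 command,
    // e.g. #51..#last if this frame stopped at #50. Sprites reused across
    // both frames keep their fade at 1.
    std::pair<std::size_t, std::size_t> EndFrame() {
        std::pair<std::size_t, std::size_t> unused{ used_, lastUsed_ };
        lastUsed_ = used_;
        used_ = 0;
        return unused;
    }

private:
    std::vector<Sprite> sprites_;
    std::size_t used_ = 0;      // sprites handed out this frame
    std::size_t lastUsed_ = 0;  // sprites handed out in the previous frame
};
```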

After refactoring my code, it was difficult to improve space much further. While I could remove some commands, at the end of the day, each line sprite must still be moved, scaled, and rotated, and those costs are constant. While looking over the generated OSB file one day, however, I was able to make some unexpected improvements.

This all came about because I saw a weird dip in OSB size. I recall something similar occurring in Dualive, but I couldn't pinpoint why it happened. After seeing it happen a few times and performing a few tests, I found out it was because osu! applied its own optimizations when you saved the map in game. For example, if the starting and ending values in a command are the same, osu! will blank out the ending values to save space.

I started to pay more attention to the storyboard file to see what else I could reduce, and there turned out to be a few interesting areas to attack. One was file names. Each sprite image references a path relative to the map directory. So instead of making a subfolder and putting files in there, it saved a lot more characters to reference files with single-character names. For example, compare "Storyboard/Blank.png" vs "b.png". That's something like 15 bytes * #sprites saved.

But where I saved the most space, however, was with floating point precision. When writing floating points into a file, my library allowed up to something like 6 digits of precision. That was very expensive and absolutely unneeded. For move commands, I removed decimals completely without noticeable consequence, and for other commands, I added an optional precision argument to signify how many decimal places to allow.
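
The precision trim is just string formatting; something like this sketch (names illustrative), where a value is written with a capped number of decimals and trailing zeros are stripped.

```cpp
// Sketch of the precision trim: write a value with a capped number of decimal
// places and strip trailing zeros, so "320.000000" becomes "320" and
// "0.500000" becomes "0.5". Function names are illustrative.
#include <iomanip>
#include <sstream>
#include <string>

std::string FormatValue(double value, int precision = 0) {
    std::ostringstream ss;
    ss << std::fixed << std::setprecision(precision) << value;
    std::string s = ss.str();
    if (precision > 0) {
        // Drop trailing zeros, and the decimal point if nothing follows it.
        s.erase(s.find_last_not_of('0') + 1);
        if (!s.empty() && s.back() == '.') {
            s.pop_back();
        }
    }
    return s;
}

// Example: move commands get whole pixels, other commands keep a few decimals.
// FormatValue(320.0)     -> "320"
// FormatValue(0.5012, 2) -> "0.5"
```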

Testing a few things during this time, another improvement I made, not optimization related, was changing how the lines were drawn in osu!. In the beginning, I used a 1x1 blank white image that could be stretched into place. I came across a suggestion to add a transparent layer of pixels around a middle pixel, and I tried that to create a more anti-aliased look. The results are as you see in the video. I had to increase the scale factor by some amount to stretch out the lines some more, but I really love the natural, soft feel and am glad I came across this improvement.

After the lines were close to completion in the animation, Royce and I met up for what turned out to be our only in-person meeting to plan out the rest of the project. Anything else we mainly discussed through Discord. Our meeting consisted of concretely deciding what effects to make and where to place them, reviewing the finished line animation, and handling a few miscellaneous tasks.

Up until now, Royce had been getting familiar with the library and had made a few dot/particle effects to showcase in the background. The three effects he came up with were: DotGenerator, which generates a bunch of dots with a Poisson distribution and has them bounce off walls; DotWave, a sine wave dot pattern that can move up and down at different speeds; and DotTree, a branching effect where one branch can split off to create a tree-like pattern.
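
To give a rough idea of what an effect like DotWave computes, here's a sketch with guessed parameter names rather than Royce's actual class: dots spaced across the screen whose vertical positions follow a sine wave that drifts over time.

```cpp
// Rough sketch of what a DotWave-style effect computes: dots spread evenly
// across the screen, with vertical positions following a sine wave that
// drifts over time. Parameter names are guesses, not the actual DotWave class.
#include <cmath>
#include <vector>

struct DotPosition { double x, y; };

const double kPi = 3.14159265358979323846;

// Assumes dotCount >= 2. Time is in milliseconds.
std::vector<DotPosition> WaveAtTime(int dotCount, double time,
                                    double baseY = 240.0,      // center line
                                    double amplitude = 40.0,   // wave height
                                    double wavelength = 160.0, // px per cycle
                                    double speed = 0.05) {     // px drift per ms
    const double screenWidth = 640.0;  // standard osu! storyboard width
    std::vector<DotPosition> dots;
    for (int i = 0; i < dotCount; ++i) {
        double x = screenWidth * i / (dotCount - 1);
        double phase = 2.0 * kPi * (x + speed * time) / wavelength;
        dots.push_back({ x, baseY + amplitude * std::sin(phase) });
    }
    return dots;
}
```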

This should all sound somewhat familiar if you've seen the storyboard, because these are the exact effects we put into the background. Royce created general classes for each effect and had samples of what they could look like. I say samples very roughly, because none of his effects could be integrated as-is. They would require a lot of fine-tuning to become what they are now.

Royce's last remaining effect was to do lyrics, which he had been putting off for some time. I would help work with him on this, doing my part to update the stroking utility while he worked out the drawing in game. So far, I thought he was doing pretty well for a new coder. He was gone for most of the summer in Japan, so he actually didn't have a lot of time to do all of this. Well, later, we would have to revisit all of his effects some more, but, we'll get to that.

During our meeting, we reviewed my animation. I wanted his opinion on which sections needed improvements and which parts we needed to shift or change. His main complaint was the lack of hair detail in particular sections. I agreed with this. Originally, I didn't include details in some parts because I was worried about space constraints. With the above optimizations, however, we were in the clear and had plenty of room to add more detail.

Additionally, we needed to move some parts around to fit in the lyrics. The second lyrics section, in the blue, was only made possible because I manually moved the model to the right. This required adding a new feature to the rotoscope utility to allow movement of lines and colors. We also noted that a few sections, like the running, were way too choppy, so I needed to manually adjust those as well.

Finally, the last thing we did in our meeting was make our background picture. This was actually kind of an intricate process. The characters on the envelope were formed by having Royce write onto notebook paper, cutting out the paper characters, and taping them to the envelope. Then he got into position, and I recorded a video of him pretending to write on top.

Why a video instead of a picture, you ask? Because the next step was to use my rotoscope utility to generate lines and colors so that I could put the single-frame animation into osu! and use the game to render it for viewing. Pretty good idea, right? The first take of the background was pretty good, but I had to make some later adjustments after finalizing the dot effects and switching to a transparent block for lines and color.

Final background image

A very productive meeting. We then went our separate ways and started working on our tasks. For Royce, however, I had one extra special task to help speed up productivity. We were starting to run into time issues trying to test the storyboard, because the animation portion took way too long to generate.

To cut down on generation time, we made several adjustments. I added timestamp parameters to the animation class constructor so you could specify the start and end times you wanted to generate for the rotoscope. Royce, on the other hand, worked on a bitmap to signify which effects should generate at runtime.

The bitmap worked by only generating specified effects. Each effect would be saved into a local OSB file. If the effect was set to 0 in the bitmap, then nothing was generated, and the existing file was used. At the end of the storyboard generation, all the local OSB files were combined into one collective storyboard file.
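
In sketch form, with illustrative flag names and paths rather than our real setup, the idea looks something like this: each effect gets a bit, set bits regenerate and refresh their cached OSB chunk, unset bits reuse the chunk from disk, and everything gets concatenated at the end.

```cpp
// Sketch of the effect "bitmap": each effect has a bit, set bits regenerate
// and refresh a cached per-effect OSB chunk, unset bits reuse the chunk from
// disk, and everything is concatenated at the end. Flag names, paths, and the
// generate callback are illustrative, not our actual classes.
#include <fstream>
#include <functional>
#include <iterator>
#include <string>
#include <vector>

enum EffectFlags : unsigned {
    kRotoscope = 1 << 0,
    kDotWave   = 1 << 1,
    kDotTree   = 1 << 2,
    kLyrics    = 1 << 3,
};

struct Effect {
    unsigned flag;
    std::string cacheFile;                  // local OSB chunk for this effect
    std::function<std::string()> generate;  // produces the OSB commands
};

void BuildStoryboard(const std::vector<Effect>& effects, unsigned enabled,
                     const std::string& outputPath) {
    std::ofstream out(outputPath);
    for (const auto& effect : effects) {
        if (enabled & effect.flag) {
            // Bit set: regenerate this effect and refresh its cached chunk.
            std::string osb = effect.generate();
            std::ofstream(effect.cacheFile) << osb;
            out << osb;
        } else {
            // Bit not set: splice in whatever was generated last time.
            std::ifstream cached(effect.cacheFile);
            std::string chunk((std::istreambuf_iterator<char>(cached)),
                              std::istreambuf_iterator<char>());
            out << chunk;
        }
    }
}
```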

All things considered, I don't think the bitmap was that useful compared to just straight up commenting sections in and out. The only time generation became a significant issue was if you tried to generate everything together, which took about 45 or so seconds. You could cut down the time significantly by only drawing a small portion of the animation.

As for me, I first worked on redrawing. I not only went through and added hair details, but also cleaned up frames that I thought weren't up to par. This took me longer than expected, but I think the end result was worth it. Then I added tools to allow moving a frame's lines and colors, and I fixed up all the shaky cam motion with manual adjustments.

Next it was time to color. My first attempt at this was similar to what you see in the final product: I used the transparent block to color, which results in this rectangular glow effect. I thought it looked okay, but my original intention was to use solid blocks to color everything. I tested coloring with solid blocks, and after some feedback from Royce, we decided it wasn't going to work out that way. The overlapping looked too awkward and jagged.

So I just drew giant glow rectangles, which for me, timewise, worked out great. It's easy and quick to drag a simple rectangle over the whole figure. The only extra thing I did was hard code particular sections in and out, matched down to the frame.

Coloring was done very fast. It was done so fast that I had time to try different variations of coloring, without much success however. Stacking rectangles doesn't look that great in my opinion, and I think the consistency of just one glow rectangle was good. There's also the problem of SB load. Since glow rectangles are very large, introducing multiple large rectangles quickly causes load issues. SB load is kind of a meaningless criterion these days, but it's still something I like to adhere to. Less is usually better.

After coloring, I moved on to background effects to help out. My first task was to finish out the ending. I knew I wanted to do an Okaerinasai tribute, but the execution took a few tries. My first attempt was to divide the background into polygon shapes and then fill each with a particular color. This didn't give much variation, however, so I decided to just grab the colors directly from an existing image.

To pull color info from an image, I used libpng. The C functions were a little bit unwieldy, so I made a wrapper to get that information in a friendlier manner. To create a more even distribution, I followed Royce's example in using a Poisson distribution, but instead of using his library (which he claimed was broken but actually wasn't), I made my own Poisson disk algorithm after a bit of research.
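
For the idea (not my exact implementation), the simplest dart-throwing form of Poisson disk sampling just keeps proposing random points and only accepts ones at least a minimum radius away from everything accepted so far; the grid-accelerated Bridson method is the usual faster version, but the resulting spread looks the same.

```cpp
// Naive dart-throwing sketch of Poisson disk sampling: keep proposing random
// points and only accept ones at least `radius` away from everything accepted
// so far. Slower than a grid-accelerated method, but the accepted set has the
// same evenly spread, non-clumpy look.
#include <cmath>
#include <random>
#include <vector>

struct Point2 { double x, y; };

std::vector<Point2> PoissonDisk(double width, double height, double radius,
                                int maxAttempts = 20000) {
    std::mt19937 rng{ std::random_device{}() };
    std::uniform_real_distribution<double> ux(0.0, width), uy(0.0, height);
    std::vector<Point2> points;
    for (int attempt = 0; attempt < maxAttempts; ++attempt) {
        Point2 candidate{ ux(rng), uy(rng) };
        bool farEnough = true;
        for (const Point2& p : points) {
            if (std::hypot(candidate.x - p.x, candidate.y - p.y) < radius) {
                farEnough = false;  // too close to an accepted point
                break;
            }
        }
        if (farEnough) {
            points.push_back(candidate);
        }
    }
    return points;
}
```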

I had to go a few steps further to match the image to the background properly. Shannon isn't standing in the frame at the same level as the Okaerinasai girl, so this required some Photoshop correction. I measured the model's position in osu! and cropped down the Okaerinasai background image to match the coordinates. And with that to reference from, I was able to test, modify, and finish out the ending.

By this time, I think Royce had gotten a good chunk of his lyrics code into the storyboard side, and with Okaerinasai finished, I took the opportunity to update the stroke utility with new features to support Bezier curves. We fixed a few bugs, and with pretty much nothing left to do, I started to intrude into Royce's effects territory, tailoring his effects to what I envisioned in the background.

Getting the look I wanted required a lot of fixing and adjustments. The first effect I tackled was dot generation. For this, I had a specific pattern in mind that I wanted to replicate. This came from a video (below) of Glassy Sky, a song from Tokyo Ghoul. I haven't seen the show. I'm a little too hipster to fall into that mainstream kind of stuff. I think the song's fine though. It's growing old on me now, but the takeaway here, let's not forget, is the background image and effects.

Having this concept in mind was not good enough on its own, however. Royce didn't make his effects particularly flexible, and I needed to add additional parameters for the functionality I wanted. For DotGenerator, it looks like I added parameters for scale, speed, color, fade, image path, and expand amount in the constructor.

I kept very little of Royce's original dot samples. His vision of the dots consisted of large blue dots shifting colors and bouncing around at a pretty quick pace. My Glassy Sky idea was much slower and probably quite different from what he imagined. With him busy, however, I took control of the design and did what I wanted, pending his approval/review.

There were some bugs and problems that came up. It was probably a combination of Royce's inexperience and my difficulty in reading code (I consider it one of my weak points), but I struggled to fix some bugs on my own and shoveled them to Royce. In DotGenerator, there was a funny situation where he thought the PoissonGenerator code he grabbed online was causing an infinite loop, but that turned out to be from his own code. I showed him the affected lines and he went off on his own to fix them.

I worked with DotTree next, and this was far easier to use. I don't think there were any jarring issues with the branching that I had to deal with, and I made minor modifications to allow for more specific scaling. Royce had one or two trees already designed, and I drew inspiration from these and worked off of them. Overall, I think the end result looks... a bit odd. I don't know, it's just kind of weird sprouting trees from nowhere. I like our patterns though, especially what I dub the "Quintessential Tree."

Finally there was the DotWave section. This needed some extra work. I complained that I couldn't figure out what the parameters did. In my offense, I'm not very good at wave math. What I wanted was to be able to have the wave bend in sync with the music, but that wasn't working with the values I was throwing in. Royce took a look at it, found the problem, and patched it up.

Now that it was working, I put in the configuration to set up the wave accordingly, and the end result is as you see. I feel like this is probably the most out of place effect in the grand scheme of the storyboard. As some have commented, it feels too fast and jarring next to the rest of the slow effects. I also had my concerns here, so to ease the wave in and fill the remaining space, I used DotGenerator to add a lot of what I called sprinkles into the surroundings.

With good timing, Royce finished his lyrics soon after, and I think they look really nice. Per his intuition, the curve strokes turned out very well. I didn't have too much to critique, complaining only about the way a few specific characters were stroked and the position of some lines. Royce fixed those, and we were good to go.

And that's that. I set up OBS, took a few video recordings and was ready to finish. But, before we could release, I wanted to wait for our beatmap. Kenny, a friend from osu! club, offered himself as sacrifice to make a mapping contribution for us. He worked on and off for two or three weeks, and by the end we had something pretty much complete, hit sounds and everything.

I got a chance to offer some basic modding advice to him. Not so sure how much he found helpful, but I personally found it an interesting experience. I pretty much never tap into the mapping realm. I think it's just that... I don't find mapping very easy. There's so much variation involved, with design and principles, it's kind of hard for me to judge and say, hey this is bad, or this is good. I mean, it just feels so subjective. It doesn't help that all my favorite maps and patterns pretty much all fall into the crazy, cancer, unreadable category.

When Kenny finished, we released officially. I posted all over my social media, and our biggest boon was definitely from Reddit, where /r/osugame was very gracious in its compliments. Within the first couple of days, and now tapering out, our YouTube video got up to 2K views, which to me is a pretty big achievement, considering Dualive is still sitting somewhere just under 300. I probably should have shared Dualive on osugame too, but ah, no point in going back in time now.

Apparently the storyboard is pretty laggy and might just crash your computer

It's been a good couple months. Drawing was definitely hard, but honestly, I feel like I'd do it again. Next time, I'll even add more action, more frames, and maybe even draw everything from scratch. But now, I think I'm ready to move on. Storyboarding, hmm, it's fun. I feel like I definitely haven't reached my peak in it. I think I'm well known in a few circles for what I've made, but since I release so slowly, I'm not that popular in the overall community.

I have a ton more effects I want to bring to life, and I do want to get to them all one day. I think I'll take on a storyboard occasionally, collab with some new folks, and produce some more awesome things. I like doing this stuff. Making visual things is fun, and I hope to keep doing this forever, and maybe someday somehow get a job in it.

But right now, I think it's important for me as a person to branch out and take a step back. I feel like, just as a programmer, I've stagnated, working only with osu!'s limited storyboarding. I want to explore and learn more. Networking, graphics, security, GUI, these are all things that interest me, but they aren't something I'll gain easily through storyboarding.

My next project is going to be something along the lines of yet-another-rhythm-game, Max style. I'm calling it S2VX. I don't really have everything planned out for it just yet, but I'm not gonna give up on storyboarding entirely. S2VX will have some flashy visuals, and I'm sure I'll crawl back to osu! in due time as well. I suppose at this rate, we'll see one osu! storyboard from me per year. November 2018 it is.

I'll leave you with this. I dream of a world where this was our rotoscope base.