I created this Augmented Reality (AR) filter for ByteDance (TikTok) which is featured throughout this section. Since its completion, I have used this project as a base for other filters.
In the past year, my filters have been viewed on Snapchat by users around the world over 420,000 times.
Segmentation: I used the segmentation effect to layer particles behind the user. Normally, segmentation is used to place an image behind the user. After a few tests, I was able to keep the user's real background intact while the particle effect played behind them, rather than having an image overwrite the entire background.
Aura Particles: I created this effect by adjusting the emitter of the standard particle effect. Its purpose is to add a colored glow behind the user. I achieved this by modifying the alpha values of the default particle and adding an upward velocity so the effect appears to rise around the user.
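The rising-glow behavior boils down to a simple per-frame update: an upward velocity moves each particle along the y-axis while its alpha fades over its lifetime. This is a minimal standalone sketch of that math, not Lens Studio's actual particle API; the function name and values are illustrative.

```javascript
// Minimal sketch of the aura particle update: upward drift plus alpha fade.
// Values are illustrative, not the ones used in the actual filter.
function updateAuraParticle(particle, dt) {
  particle.y += particle.upwardVelocity * dt; // rise behind the user
  particle.age += dt;
  // Fade alpha linearly from full to zero over the particle's lifetime.
  particle.alpha = Math.max(0, 1 - particle.age / particle.lifetime);
  return particle;
}

const p = { y: 0, age: 0, alpha: 1, upwardVelocity: 2, lifetime: 1.5 };
updateAuraParticle(p, 0.5); // half a second later: higher and more transparent
```

In the real filter the equivalent knobs live on the particle material's emitter settings rather than in code.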
Dust/Debris Particles: This effect appears after the power-up effect is used a second time (blue mode), adding more intensity. It is made from small floating debris particles with a stronger upward velocity, tuned to match the velocity of the Aura effect.
Audio: There are two audio files I edited to match Lens Studio's audio guidelines: each track is under 15 seconds long, in .mp3 format, with mono sound.
Eye Color: The user's eye color changes depending on which effect is active: Standard has a teal eye color, while Blue has a darker blue color.
Head Binding: A 3D hair model is bound to the user's head when the effects are activated. The hair color changes based on which effect is in place: Standard has blond hair, while Blue has blue hair. I also added a Rim Color effect to the hair material to give the outlines of the 3D model a darker accent.
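The per-mode look described above (teal eyes and blond hair in Standard, darker blue eyes and blue hair in Blue) amounts to a small lookup keyed by the active mode. A minimal sketch with hypothetical names and placeholder color labels:

```javascript
// Hypothetical per-mode appearance table; in the real filter these values
// are set on the eye color effect and the hair material in Lens Studio.
const MODE_LOOKS = {
  standard: { eyeColor: "teal", hairColor: "blond" },
  blue: { eyeColor: "dark blue", hairColor: "blue" },
};

function applyLook(mode) {
  const look = MODE_LOOKS[mode];
  if (!look) {
    throw new Error("unknown mode: " + mode);
  }
  return look; // the real script would update materials/effects here
}
```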
Retouch: I used this for very minor tweaks: the teeth get a little whiter, the eyes sharper (to support the eye color effect), and the skin slightly softer.
Face Liquify (Eyes): I used this effect to slightly enlarge the eyes when the effect is active. I thought it would add to the likeness of the show the filter is based on.
Face Stretch: I used this to complement the Face Liquify effect. Most of the modifications make the eyes look more natural when Face Liquify is active.
Interaction Manager Script: Opening the mouth begins the effect, which starts by enabling the orthographic and effect cameras (covering all effects listed in Features). Once activated, the effect runs for 10 seconds before turning off. Originally the effect ended on mouth close, but I wanted users to be able to use the effect without worrying that closing their mouth would end it. If the user opens their mouth again while the effect is running, the intensity increases: some features change color and additional particle effects are added. Triggering this also resets the timer, and the effect still ends normally after 10 seconds. The user can repeat the trigger as many times as they like within the 10-second window. Once the effect ends, it can always be used again, but the additional intensified effects are reset (and can be re-activated by following the same process as the first attempt).
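The timing logic above can be modeled as a small state machine: a mouth-open either starts the effect or intensifies it, every mouth-open re-arms the 10-second timer, and expiry turns everything off and resets the intensity. This is a standalone sketch of that logic only; the real script wires it to Lens Studio's mouth-open event and to the orthographic/effect cameras, and the names here are illustrative.

```javascript
// Standalone sketch of the interaction timing logic. EFFECT_DURATION and
// the field names are illustrative; the real script toggles cameras/effects.
const EFFECT_DURATION = 10; // seconds

class InteractionManager {
  constructor() {
    this.active = false;
    this.intense = false; // "blue mode" after a second mouth open
    this.timeLeft = 0;
  }

  onMouthOpened() {
    if (!this.active) {
      this.active = true; // first open: enable cameras + effects
    } else {
      this.intense = true; // repeat open: boost colors, extra particles
    }
    this.timeLeft = EFFECT_DURATION; // every open resets the 10 s timer
  }

  update(dt) { // called once per frame
    if (!this.active) return;
    this.timeLeft -= dt;
    if (this.timeLeft <= 0) { // timer ran out: turn everything off
      this.active = false;
      this.intense = false; // intensity resets; must be re-triggered
      this.timeLeft = 0;
    }
  }
}
```

Opening the mouth a third or fourth time inside the window just keeps resetting the timer, matching the "repeat as many times as they like" behavior.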
Audio Manager Script: The project has two audio tracks. The first plays an initial power-up sound; the second is a continuous loop of the powered-up sound in an idle state. The first track always plays once when the user opens their mouth (following the Interaction Manager's logic). Once that track finishes (it is 5 seconds long), the second track starts and loops continuously until the effect ends (when the 10-second timer runs out).
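The two-track sequencing can be sketched the same way: the 5-second power-up track plays once, then hands off to the idle loop until the effect ends. This is a standalone timing sketch with illustrative names; the real script presumably drives two audio components in Lens Studio rather than tracking time itself.

```javascript
// Standalone sketch of the two-track audio handoff. POWER_UP_LENGTH is the
// 5 s intro; the state names are illustrative, not the real script's.
const POWER_UP_LENGTH = 5; // seconds

class AudioManager {
  constructor() {
    this.current = "none"; // "none" | "powerUp" | "idleLoop"
    this.elapsed = 0;
  }

  onEffectStart() { // mouth opened, effect begins
    this.current = "powerUp";
    this.elapsed = 0;
  }

  onEffectEnd() { // the effect's 10 s timer ran out
    this.current = "none";
  }

  update(dt) {
    if (this.current !== "powerUp") return;
    this.elapsed += dt;
    if (this.elapsed >= POWER_UP_LENGTH) {
      this.current = "idleLoop"; // loop until the effect ends
    }
  }
}
```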
Mute: Allows users to mute the audio with a tap; tapping toggles between muted and unmuted. I included this as an extra feature in case a user wants to record over a song without the filter's audio, or to layer the filter's audio over a song.
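The toggle itself is a single flipped flag. A minimal sketch, assuming the real script hangs this off a tap event and uses the flag to silence the audio components:

```javascript
// Sketch of the tap-to-mute toggle; the flag would gate audio playback
// in the real filter. Names are illustrative.
let muted = false;

function onTap() {
  muted = !muted; // each tap flips between muted and unmuted
  return muted;
}
```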