Creative Technology Lab
HAS Studio Project

HAS Fluid Blob

A real-time raymarched SDF blob engine with PBR shading, softbox lighting, audio reactivity, and a full modifier stack — running entirely in the browser at 60fps.

WebGL GLSL Three.js Raymarching SDF PBR Audio Reactive
Year 2026
Type Web App
Status ● Live
1,165 Lines of Code · 17 Animation Presets · 8 Mesh Modifiers · 60 FPS Target
001 Overview
Fluid Blob is a browser-based 3D blob generator and material design tool. It uses raymarched Signed Distance Fields (SDFs) to render smooth, merging organic shapes in real-time, shaded with a physically-based rendering pipeline that includes GGX specular, Schlick fresnel, environment reflections, clearcoat, refraction with Beer's law absorption, and a three-light softbox studio setup.

SDF Raymarching
Up to 10 blob primitives raymarched per frame using sphere tracing. Blobs merge via smooth-minimum (polynomial smin) with adjustable merge radius, producing fluid organic transitions between shapes.
PBR Material System
Full physically-based shading with metallic/roughness workflow. GGX microfacet specular, Schlick fresnel approximation, clearcoat layer, IOR-based glass refraction, and volumetric Beer's law absorption for transparent materials.
Audio Reactive Engine
Three-band FFT analysis (bass/mid/high) with configurable routing matrix. Any audio frequency band can drive any visual parameter — size, merge, drift, roughness, fresnel, animation radius, modifiers, and more.
Modifier Stack
Eight real-time mesh modifiers applied in the SDF domain: twist, taper, bend, ripple, spike, inflate, squash, and melt. Each modifier can be toggled, stacked, and driven by audio.
002 How It Works
01
Animation System — 17 Presets
Each animation preset defines unique parametric motion paths calculated per-blob per-frame. Controls include speed, radius, symmetry, phase spread, wobble, Z-depth motion, chaos, easing, reverse, and mirror.
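One such parametric path might look like the sketch below — an "Orbit"-style preset computing a per-blob position from time and index. The parameter names (speed, radius, phaseSpread, wobble) mirror the controls listed above but are illustrative, not the app's actual code.

```javascript
// Hypothetical sketch of one parametric motion path: an Orbit-like preset.
// Each blob gets a phase offset so the group spreads around the circle.
function orbitPosition(i, t, { speed, radius, phaseSpread, wobble }) {
  const phase = t * speed + i * phaseSpread;  // per-blob phase offset
  return [
    Math.cos(phase) * radius,                 // circular X motion
    Math.sin(phase * 2) * wobble,             // vertical wobble
    Math.sin(phase) * radius,                 // circular Z motion
  ];
}

const p = orbitPosition(0, 0, { speed: 1, radius: 2, phaseSpread: 0.5, wobble: 0.3 });
// at t = 0, blob 0 sits at [radius, 0, 0]
```

Mirror, reverse, and easing would then be applied as transforms on `phase` before evaluation.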
02
Texture Mapping — 6 UV Modes
Drag-and-drop any image to map it onto the SDF surface. Six UV projection modes (triplanar, spherical, cubic, cylindrical, planar, wrap), six blend modes, plus scale, rotation, offset, tiling, contrast, and bump mapping.
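As a sketch of one of the six projections, spherical mapping converts a surface point to longitude/latitude texture coordinates. This is a simplified illustration, not the shader's exact code:

```javascript
// Spherical UV projection: map a 3D surface point to [0,1]² coordinates.
function sphericalUV(p) {
  const r = Math.hypot(p[0], p[1], p[2]);
  const u = Math.atan2(p[2], p[0]) / (2 * Math.PI) + 0.5; // longitude around Y
  const v = Math.acos(p[1] / r) / Math.PI;                // latitude from pole
  return [u, v];
}
```

Scale, rotation, and offset would then be applied to the resulting UV pair before the texture lookup.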
03
Studio Lighting
A virtual softbox studio with three-light setup: key light (warm, upper right), fill light (cool, left), and rim light (narrow, behind). HDRI-approximation environment map provides realistic reflections with horizon glow and floor bounce.
04
SDF Merging
Smooth-minimum polynomial blending produces organic merging between blobs. The merge radius is adjustable in real-time, controlling how aggressively shapes fuse into each other.
05
Audio Routing Matrix
Three-band FFT decomposition (bass 0–12%, mid 12–45%, high 45–100%) with configurable smoothing. Any frequency band routes to any shader parameter with adjustable gain — a node-based approach to audio-visual mapping.
06
Space Deformation
Modifiers deform 3D space before SDF evaluation — twist rotates around Y by height, taper scales by height, bend curves along an axis, ripple applies sine displacement, spike adds noise, inflate pushes outward, squash flattens Y while expanding XZ, melt droops by distance from center.
Rendering pipeline — Blob Positions → Modifier Stack → SDF Raymarch → PBR Shading → Output
003 Features
17 Animation Presets
Drift, Orbit, Breathe, Spiral, Split, Pulse, Vortex, Flower, Wave, Lissajous, Pendulum, Bounce, Galaxy, Figure8, Jellyfish, Helix, and Kaleidoscope — each with speed, radius, symmetry, wobble, chaos, and easing controls.
8 Mesh Modifiers
Twist, Taper, Bend, Ripple, Spike, Inflate, Squash, and Melt — all applied in the SDF domain before distance evaluation. Stackable, toggleable, and audio-driveable.
6 UV Projection Modes
Triplanar, Spherical, Cubic, Cylindrical, Planar, and Wrap. Six blend modes (Mix, Multiply, Add, Screen, Overlay, Replace) plus scale, rotation, offset, tiling, contrast, and bump mapping.
PBR Material Engine
GGX microfacet specular, Schlick fresnel, metallic/roughness workflow, clearcoat, IOR-based refraction with Beer's law volumetric absorption. Three-light softbox studio with HDRI env map.
Audio Reactivity
512-bin FFT via Web Audio API. Three-band split with configurable routing matrix — connect any frequency band to any visual parameter. Bass to merge, highs to roughness, mids to drift.
PNG Export
Export current frame as PNG with standard or alpha-channel variants. Background is configurable between gradient and solid color modes. Perfect for hero visuals and brand assets.
004 Code Snippets
GLSL
SDF Raymarching Core
// Smooth-minimum blending for organic merging
float smin(float a, float b, float k) {
  float h = clamp(.5 + .5 * (b - a) / k, 0., 1.);
  return mix(b, a, h) - k * h * (1. - h);
}

float map(vec3 p) {
  vec3 mp = applyModifiers(p);
  float d = 100.;
  for(int i = 0; i < 10; i++) {
    if(i >= u_numBlobs) break;
    float r = (i == 0) ? u_coreR : u_dropR;
    float db = length(mp - applyModifiers(u_blobs[i])) - r;
    if(i == 0) d = db;
    else d = smin(d, db, u_merge);
  }
  return d;
}
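The sphere-tracing loop that consumes `map()` is not shown above; a JavaScript sketch of it follows, with a unit sphere standing in for the blob scene. The constants (`MAX_STEPS`, `EPS`, `MAX_DIST`) are illustrative, not the shader's actual values:

```javascript
// Sphere tracing: step along the ray by the SDF distance, which is always
// a safe step size, until we are within EPS of a surface or leave the scene.
const MAX_STEPS = 128, EPS = 1e-4, MAX_DIST = 100;

// Stand-in scene: a unit sphere at the origin.
function map(p) {
  return Math.hypot(p[0], p[1], p[2]) - 1.0;
}

// March from origin `ro` along unit direction `rd`;
// return hit distance, or -1 on a miss.
function raymarch(ro, rd) {
  let t = 0;
  for (let i = 0; i < MAX_STEPS; i++) {
    const p = [ro[0] + rd[0] * t, ro[1] + rd[1] * t, ro[2] + rd[2] * t];
    const d = map(p);        // distance to nearest surface
    if (d < EPS) return t;   // close enough: hit
    t += d;                  // advance by the full safe distance
    if (t > MAX_DIST) break; // left the scene
  }
  return -1;
}
```

A ray fired from z = −5 toward the origin hits the unit sphere at distance 4; a ray pointed away from it escapes past `MAX_DIST` and reports a miss.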
GLSL
PBR Shading Pipeline
// GGX microfacet specular distribution
float ggx(vec3 n, vec3 h, float r) {
  float a2 = r*r*r*r;
  float NdH = max(dot(n, h), 0.);
  float d = NdH*NdH * (a2 - 1.) + 1.;
  return a2 / (3.14159 * d*d + .0001);
}

// Schlick fresnel — metals use albedo as F0
vec3 F0 = mix(vec3(.04), albedo, u_metal);
vec3 schlickF = F0 + (1.-F0) * pow(1.-NdV, 5.);

// Three-light softbox studio
vec3 diff = albedo * (1.-u_metal) * (
  NdL1 * u_keyL  * .5  +   // warm key
  NdL2 * u_fillL * .3  +   // cool fill
  NdL3 * u_rimL  * .25 +   // rim strip
  .15                       // ambient floor
);
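The Beer's law absorption used for the glass material can be sketched as an exponential falloff per color channel. The coefficient and distance values here are illustrative, not the shader's actual uniforms:

```javascript
// Beer's law: transmittance T = exp(-sigma * d), applied per RGB channel.
// Light traveling farther through the medium is absorbed more strongly.
function beerLambert(color, absorb, dist) {
  return color.map((c, i) => c * Math.exp(-absorb[i] * dist));
}

// A thicker path through the blob absorbs more, tinting the transmitted light.
const thin  = beerLambert([1, 1, 1], [0.1, 0.4, 0.8], 0.5);
const thick = beerLambert([1, 1, 1], [0.1, 0.4, 0.8], 3.0);
```

In the raymarcher, `dist` would be the path length of the refracted ray through the blob interior.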
JS
Audio Reactive Routing
// Three-band FFT → visual parameter routing matrix
let audioRoutes = [
  { band: 'bass',  prop: 'size',    amount: 0.5 },
  { band: 'bass',  prop: 'merge',   amount: 0.3 },
  { band: 'mid',   prop: 'drift',   amount: 0.5 },
  { band: 'high',  prop: 'rough',   amount: 0.2 },
  { band: 'bass',  prop: 'fresnel', amount: 0.4 },
];

// Accumulate modulations and clamp to safe ranges
function applyAudioToUniforms() {
  const mods = {};
  audioRoutes.forEach(r => {
    const bv = getBandValue(r.band);
    const cfg = routeProps[r.prop];
    mods[r.prop] = (mods[r.prop] || 0)
      + bv * r.amount * cfg.scale;
  });
  // Clamp each accumulated modulation to its configured maximum
  for (const prop in mods) {
    mods[prop] = Math.min(mods[prop], routeProps[prop].max);
  }
  return mods;
}
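The `getBandValue` helper the routing matrix relies on can be sketched as an average over a slice of the FFT bins, using the bass 0–12% / mid 12–45% / high 45–100% split described earlier. `fftBins` stands in for the `Uint8Array` filled by `AnalyserNode.getByteFrequencyData()`; the exact signature is an assumption:

```javascript
// Average the FFT bins inside a band's fractional range, normalized to 0..1.
const BANDS = { bass: [0, 0.12], mid: [0.12, 0.45], high: [0.45, 1.0] };

function getBandValue(band, fftBins) {
  const [lo, hi] = BANDS[band];
  const start = Math.floor(lo * fftBins.length);
  const end = Math.max(start + 1, Math.floor(hi * fftBins.length));
  let sum = 0;
  for (let i = start; i < end; i++) sum += fftBins[i]; // byte values 0..255
  return sum / (end - start) / 255;                    // normalize to 0..1
}
```

With a 512-bin analyser, bass covers roughly the first 61 bins, mids the next 169, and highs the rest.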
GLSL
SDF Modifier Stack
vec3 applyModifiers(vec3 p) {
  vec3 q = p;
  // Twist — rotate around Y axis by height
  if(u_modTwist != 0.) {
    float tw = q.y * u_modTwist * 2.;
    q = vec3(q.x*cos(tw)-q.z*sin(tw), q.y,
             q.x*sin(tw)+q.z*cos(tw));
  }
  // Squash — flatten Y, expand XZ
  if(u_modSquash != 0.) {
    q.y *= 1. - u_modSquash * .6;
    q.x *= 1. + u_modSquash * .3;
    q.z *= 1. + u_modSquash * .3;
  }
  // Melt — droop by distance from center
  if(u_modMelt != 0.) {
    float dist = length(q.xz);
    q.y -= dist*dist * u_modMelt * .8;
  }
  return q;
}
005 Technology Stack
Three.js r128
WebGL abstraction layer. Provides OrthographicCamera, ShaderMaterial, and render loop. The actual rendering is 100% custom GLSL.
GLSL / Fragment Shader
The entire visual output is a single fullscreen fragment shader. SDF raymarching, PBR lighting, texture mapping, and modifiers all run on the GPU.
Web Audio API
AnalyserNode with 512-bin FFT for real-time frequency decomposition. Three-band split (bass 0–12%, mid 12–45%, high 45–100%) with configurable smoothing.
Vanilla JavaScript
Zero frameworks. Animation loop, blob physics, drag interaction, modifier stack, audio routing — all hand-rolled for minimal overhead.
CSS (No Frameworks)
Custom dark-mode panel UI with JetBrains Mono type, range sliders, collapsible sections, and drag-and-drop zones. No Tailwind, no Bootstrap.
Single HTML File
Everything ships in one file. No build step, no npm, no bundler. Open in any modern browser. The only external dependency is Three.js from CDN.
006 Who This Is For
Motion Designers
Create hero visuals, blob animations, and organic 3D elements for brand work, social content, and broadcast graphics. Export transparent PNGs or screen-record for After Effects and Premiere comps.
Creative Technologists
Use as a starting point for interactive installations, live visuals, and custom client experiences. The single-file architecture makes it trivial to fork, extend, and deploy.
VJs & Live Visual Artists
Audio-reactive routing turns any music into real-time visuals. Feed a DJ set, route bass to merge and size, route highs to roughness — instant generative visuals for live events.
Shader Learners & Educators
Well-structured GLSL with raymarching, SDF operations, PBR lighting, and texture mapping in a single readable shader. A living reference for anyone learning real-time graphics programming.
007 Where It Could Go
01
WebGPU Compute Migration
Port the raymarcher to WebGPU compute shaders for 2–4x performance gains. Unlocks higher resolution, more ray steps, and more complex SDF scenes without frame drops.
02
Video & GIF Export
Integrated video recording with MediaRecorder API. Export animation loops as MP4 or GIF directly from the browser — no screen recording required.
03
Preset Save & Share
Serialize full state into shareable JSON presets. Community gallery with one-click load. URL-encoded state for instant sharing.
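A minimal sketch of what URL-encoded state could look like, assuming JSON-serializable settings: stringify, percent-escape, base64-encode into the URL hash, and reverse on load. The field names are illustrative:

```javascript
// Round-trip a settings object through a shareable URL hash fragment.
// encodeURIComponent keeps the JSON latin1-safe for btoa.
function encodePreset(state) {
  return '#p=' + btoa(encodeURIComponent(JSON.stringify(state)));
}

function decodePreset(hash) {
  return JSON.parse(decodeURIComponent(atob(hash.slice(3))));
}

const state = { preset: 'Orbit', merge: 0.4, modifiers: { twist: 0.2 } };
const roundTrip = decodePreset(encodePreset(state));
```

Pasting such a link would restore the full scene with no server involved, which suits the single-file architecture.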
04
MIDI Controller Support
Map physical MIDI knobs and faders to any parameter via WebMIDI API. Turn a MIDI controller into a tactile blob-sculpting instrument for live performances.
05
Node-Based Visual Editor
A node graph interface for composing SDF primitives, modifiers, color logic, and audio routing — making the shader pipeline visual and accessible to non-coders.
06
GLB / GLTF Mesh Export
Marching cubes extraction of the SDF into polygonal mesh, exported as GLB/GLTF for use in Blender, Cinema 4D, Unity, or Unreal.
07
Collaborative Real-Time Sessions
WebSocket-based multiplayer sculpting. Multiple users manipulate blobs simultaneously in the same scene — a shared creative canvas for remote teams and live events.