Robotic Pandas

On abstractions for building next generation user interfaces

My first DirectX 10 Shader in Bling

Posted by mcdirmid on 13/07/2009

Here is the code:

// Create a vertex buffer from a sphere geometry.
var Buffer = Geometries.Sphere.ToBuffer(100);
// Bind each vertex's color to a lerp between an index-derived color and red, driven by the slider.
Buffer.ForAll = (i, v) => v.Color().Bind = slider.Value.Lerp(i.SelectColor(), Colors.Red);
// Shade each vertex by scaling its color with the Z component of its normal.
var EF = Effect.Shade(vertex => vertex.Color() * (vertex.Normal().Z));

And here is the result:

[Image: the shaded sphere produced by the code above]

Yeah, it doesn’t look like much yet, but it’s a good start. I got a lot of help from reading Conal Elliott’s Vertigo paper, so this has been done before, just in Haskell rather than C#. In particular, we can get a lot of mileage out of parametric surfaces, where positions and normals are computed automatically using automatic differentiation.
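
To give a rough idea of what that means outside of Bling, here is a minimal plain-C# sketch (not Bling’s actual API): a parametric sphere whose normals come from the cross product of its partial derivatives, with the derivatives approximated by finite differences as a stand-in for true automatic differentiation.

using System;

// Minimal sketch: evaluate a parametric sphere and derive normals from the cross
// product of its partial derivatives (approximated with finite differences here,
// standing in for automatic differentiation).
static class ParametricSphere
{
    // Position on a unit sphere for parameters u, v in [0, 1].
    public static (double X, double Y, double Z) Position(double u, double v)
    {
        double theta = u * 2 * Math.PI, phi = v * Math.PI;
        return (Math.Cos(theta) * Math.Sin(phi),
                Math.Sin(theta) * Math.Sin(phi),
                Math.Cos(phi));
    }

    // Normal = normalized cross product of dP/du and dP/dv.
    public static (double X, double Y, double Z) Normal(double u, double v, double eps = 1e-4)
    {
        var p  = Position(u, v);
        var du = Subtract(Position(u + eps, v), p);
        var dv = Subtract(Position(u, v + eps), p);
        var n  = (X: du.Y * dv.Z - du.Z * dv.Y,
                  Y: du.Z * dv.X - du.X * dv.Z,
                  Z: du.X * dv.Y - du.Y * dv.X);
        double len = Math.Sqrt(n.X * n.X + n.Y * n.Y + n.Z * n.Z);
        return (n.X / len, n.Y / len, n.Z / len);
    }

    static (double X, double Y, double Z) Subtract(
        (double X, double Y, double Z) a, (double X, double Y, double Z) b)
        => (a.X - b.X, a.Y - b.Y, a.Z - b.Z);
}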


Bling WPF hits V1

Posted by mcdirmid on 08/02/2009

I’d like to announce a new and improved version of Bling WPF. In this version, we have redone the wrappers around WPF databinding and pixel shading for better usability, and a lot of documentation and examples have been added to the distribution and the Codeplex page. Finally, we’ve also added some experimental support for UI physics, with an example! A release for Visual Studio 2008/.NET 3.5 SP1 is available at http://www.codeplex.com/bling. For anyone unfamiliar with Bling, here are the primary features:

  • WPF databinding without IValueConverters in C#! For example, “button.CenterPosition.X = slider.Value * MainCanvas.Width” is valid C# code in Bling that sets up a databinding relationship binding the button’s LeftProperty so that the button moves with the slider (a sketch of the hand-written WPF equivalent follows this list).
  • WPF pixel shaders in C# without HLSL code or boilerplate! A pixel shader is simply a texture-to-pixel function, e.g., “canvas.CustomEffect = (input,uv) => slider.Value.Lerp(input[uv], ColorBl.FromScRgb(new PointBl(1,1,1) - input[uv].ScRGB, input[uv].A));” is a one-line pixel shader that inverts all the colors in the canvas, interpolated with respect to the slider’s current value. No need to write HLSL code, no need to write a custom effect class; writing a pixel shader is boiled down to its core function.
  • Bling defines many WPF convenience properties; e.g., Size is defined as (Width, Height), Right is defined as Left + Width, CenterPosition is defined as LeftTop + Size / 2. Convenience properties behave just like properties backed directly by dependency properties; i.e., they can undergo databinding, be used in pixel shaders, and so on.
  • Bling code is completely compatible with conventional WPF code. Bling wrappers are stateless so you can use Bling functionality anywhere in your program regardless of architecture.
  • UI Physics! Did you wonder what would happen if property bindings were solved via a physics engine rather than a databinding engine? Well, ok, probably not :), but the result is cool and could possibly be the future of UI toolkits. I’ll write more about this later.
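
For comparison, here is roughly what the hand-written WPF equivalent of the databinding one-liner above looks like. This is only a sketch: the converter and helper names are made up, and the half-size offset implied by CenterPosition is omitted.

using System;
using System.Globalization;
using System.Windows.Controls;
using System.Windows.Data;

// Multiplies the two bound values (slider.Value and mainCanvas.ActualWidth).
public class ProductConverter : IMultiValueConverter {
  public object Convert(object[] values, Type targetType, object parameter, CultureInfo culture) {
    return (double)values[0] * (double)values[1];
  }
  public object[] ConvertBack(object value, Type[] targetTypes, object parameter, CultureInfo culture) {
    throw new NotSupportedException();
  }
}

public static class ManualBinding {
  // Hypothetical helper: bind the button's Canvas.Left to slider.Value * mainCanvas.ActualWidth.
  public static void BindButtonX(Button button, Slider slider, Canvas mainCanvas) {
    var binding = new MultiBinding { Converter = new ProductConverter() };
    binding.Bindings.Add(new Binding("Value") { Source = slider });
    binding.Bindings.Add(new Binding("ActualWidth") { Source = mainCanvas });
    BindingOperations.SetBinding(button, Canvas.LeftProperty, binding);
  }
}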


New Bling WPF release with metaballs!

Posted by mcdirmid on 08/01/2009

No, not meatballs. I’ve done a lot of work on Bling this month, starting with a paper on the technique used to build Bling. I’ve also overhauled how pixel shader effects are expressed so that even less boilerplate is required than before. In the new release, when you want to add a signal parameter to a shader, you simply call “Sh” on the signal and it is automatically added to the list of the shader’s parameters.

As an example, consider the following code:

Bling.Shaders.Shaders.MakeDirect((txt, input, uv) => {
  FloatSh value = 0f;                     // accumulated metaball field strength for this pixel
  Point3DSh rgb = Point3DSh.New(0, 0, 0); // accumulated, weighted color

  // Signal (databinding) computation, evaluated outside the shader on the CPU.
  PointSg xyscaled = canvas.Size() /
    (canvas.Size().X + canvas.Size().Y);

  // .Sh(txt) lifts a signal into a shader parameter so it can be used per pixel.
  uv = uv * xyscaled.Sh(txt);
  for (int i = 0; i < points.Length; i++) {
    // Scale each point to canvas coordinates via databinding, then lift it into the shader.
    var p = ((points[i] - canvas.LeftTop()) / canvas.Size());
    p = p * xyscaled;
    var v = (uv - p.Sh(txt));
    v = v * v;
    var at = 1f / (v.X + v.Y);            // inverse-square falloff from this point
    value += at;
    rgb += (colors[i % colors.Length].Sh().RGB * at);
  }
  // The threshold is also computed outside the shader and lifted in.
  var area = canvas.Width() * canvas.Height();
  var at0 = (area / 400).Sh(txt);

  // Pixels above the threshold get the blended color; everything else is white.
  return ((value > at0)).Condition(
    ColorSh.New(rgb / value, 1),
    Colors.White.Sh());
});

This code mixes signal code and shader code to create a nice metaball effect. The xyscaled variable is a point signal that scales X and Y coordinates according to the dimensions of the container. It is computed outside of the pixel shader but is multiplied with the pixel coordinate (uv) by converting it to a shader parameter (xyscaled.Sh(txt)). For each point used to create the metaball effect (the effect is driven by 8 thumbs), the point is scaled according to the canvas and then re-scaled along the x and y dimensions using xyscaled, so the generated ball is a circle. All of these computations happen through databinding rather than in the pixel shader, which saves precious GPU instructions: since the values don’t change from pixel to pixel, there is no reason to replicate the operations for each pixel. After these computations are performed outside of the shader, the point is brought into the shader (p.Sh(txt)) so it can be used in an operation with the pixel coordinate. Likewise, the area of the canvas is computed outside of the GPU and brought into the shader with (area / 400).Sh(txt), where it is then used as the threshold for the metaball computation.
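
For reference, the per-pixel math the shader ends up evaluating is the classic metaball field test. Here is a plain C# restatement of it, independent of Bling; the type choices and names are illustrative only:

using System;
using System.Windows;        // Point
using System.Windows.Media;  // Color, Colors

static class MetaballMath {
  // Color of one pixel at coordinate uv, given the already-scaled metaball
  // centers, their colors, and the field threshold (area / 400 above).
  public static Color Shade(Point uv, Point[] points, Color[] colors, double threshold) {
    double value = 0, r = 0, g = 0, b = 0;
    for (int i = 0; i < points.Length; i++) {
      double dx = uv.X - points[i].X, dy = uv.Y - points[i].Y;
      double at = 1.0 / (dx * dx + dy * dy + 1e-9);  // inverse-square falloff; epsilon avoids division by zero
      value += at;
      var c = colors[i % colors.Length];
      r += c.ScR * at; g += c.ScG * at; b += c.ScB * at;
    }
    return value > threshold
      ? Color.FromScRgb(1f, (float)(r / value), (float)(g / value), (float)(b / value))
      : Colors.White;
  }
}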

Check out the result (which is animated when you run it):

[Image: screenshot of the animated metaball effect]

The metaball example is the main example in the new source code distribution, which you can get from Bling’s Codeplex page.


Shading Blobs with Bling WPF!

Posted by mcdirmid on 08/12/2008

I updated Bling WPF to version 0.6; get it at the usual place (http://www.codeplex.com/bling). Mostly, I changed the DSL to get rid of more boilerplate code. Now you can create pixel shader effects with multiple inputs and parameters in only a few lines of C# code (sorry, no XAML yet). Here is an example of a blob shader:

var effect = new EightArgLiftedShader<Point>();
effect.ShaderFunction0 = (input, uv, points) => {
  // Average the distance from this pixel (uv) to each of the lifted points.
  FloatSh d = 0;
  for (int i = 0; i < texture.SegmentCount; i++)
    d += uv.Distance(points[i].LftSh());
  d = 1 - (d / texture.SegmentCount);
  d = d * 2;
  // Scale the input color by the inverted, doubled average distance.
  var color = input[uv];
  return ColorSh.New(color.RGB * d, color.A);
};
// Bind each shader parameter to a thumb's center point, relative to the polygon being shaded.
for (int i = 0; i < texture.SegmentCount; i++)
  effect[i].Bind = polygons[j].RelativePoint(thumbs[i].CenterPosition());
polygons[j].Effect = effect;

An EightArgLiftedShader takes eight arguments of the same type (in this case Point). The parameters are packaged up as an array of ShaderValue<Point> objects (points), from which we compute the average distance to the coordinate being processed (uv). That average distance is then inverted and doubled to come up with a value to multiply the current color by (a plain C# restatement of this math appears at the end of the post). Outside of the shader, each point parameter is bound to the relative center point of each thumb that forms the skin of the polygon being shaded (basically, take the AABB of the polygon and compute where the thumb falls inside it as a percentage). The result of shading three blobs:

[Image: three shaded blobs]

A bit more 3D than a gradient brush!
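
As promised above, here is the blob shader’s per-pixel math restated in plain C#, independent of Bling; the names and types are illustrative only:

using System;
using System.Windows;        // Point
using System.Windows.Media;  // Color

static class BlobShade {
  // Scale a color by the inverted, doubled average distance from uv to the thumb points.
  public static Color Shade(Color input, Point uv, Point[] points) {
    double d = 0;
    foreach (var p in points)
      d += Math.Sqrt((uv.X - p.X) * (uv.X - p.X) + (uv.Y - p.Y) * (uv.Y - p.Y));
    double factor = (1 - d / points.Length) * 2;
    return Color.FromScRgb(input.ScA,
      (float)(input.ScR * factor), (float)(input.ScG * factor), (float)(input.ScB * factor));
  }
}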
