Praxis LIVE v2 - hybrid visual IDE for creative coding

Why does gstreamer-java still use an obsolete version of GStreamer, then?

GStreamer is still unable to read numerous HD videos in OGV; there are two bug reports about this issue :frowning:

Did you miss the bit about 1.x support above? There are now GStreamer-Java bindings for both 0.10 and 1.x. See https://github.com/gstreamer-java/gst1-java-core Happy to answer any specific questions you have about that - start another thread.
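(For anyone wanting to try the 1.x bindings, here's a minimal playback sketch - just my rough illustration, not official project code. It assumes you have the gst1-java-core jar plus a native GStreamer 1.x install on the path, and the file URI is obviously a placeholder:)

```java
import java.net.URI;

import org.freedesktop.gstreamer.Gst;
import org.freedesktop.gstreamer.elements.PlayBin;

public class PlaybackSketch {

    public static void main(String[] args) {
        // initialize the native GStreamer 1.x library
        Gst.init("PlaybackSketch", args);

        // playbin handles demuxing, decoding and output selection automatically
        PlayBin playBin = new PlayBin("player");
        playBin.setURI(URI.create("file:///path/to/video.ogv")); // placeholder URI
        playBin.play();

        // run the GLib main loop so bus messages keep being processed
        Gst.main();
    }
}
```

Requires native GStreamer, so it won't run in a bare JVM, but it shows how little glue the 1.x bindings need for basic playback.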

( … quick update, partly so I don’t have to nag @Riven to keep this topic alive again ;D … )

After a busy couple of months I’m finally finding some time to get working on a v2.3 release. There won’t be a huge range of new features for this release (I’ll leave that for v2.4), but it will include some performance improvements and added features for video playback (inc. video rate / reverse, and audio finally!). There’s also a new wrapper component for Processing 2D OpenGL, which will perform better if you don’t need 3D rendering, and some nice interface improvements like comments and colouring.

One of the things that’s kept me busy over the last couple of months, but also the reason for some of the new features, is this museum interactive. Looking like an old magic lantern, it’s actually using an LED projector and Intel compute stick, RFID tags in the slides to trigger videos, and an IR sensor / rotary encoder (brass handle) to control some simple video FX. Software is Praxis LIVE obviously, using the TinkerForge bindings for sensors. Steampunk VJ’ing here we come. :smiley:

The user base has also been growing over the last few months. Great to see this video from an artist & VJ in Montreal, who’s also provided some really useful feedback. Video takes a little while to get going, but love seeing something so different to what I use it for.

(video: https://www.youtube.com/watch?v=2Oa36w4g93M)

And finally … just been accepted to do an intro to Praxis LIVE at Libre Graphics Meeting this April.

If anyone here is in the Montreal area, I’m doing a Praxis LIVE workshop this Thursday evening (June 16th), looking particularly at video, live-coding Java and Processing integration.

I’m also doing another workshop in Oxford (back in UK!) on Sunday June 26th as part of this.

And here’s a recent video demonstrating support for audio and visual live coding - best viewed full screen!

(video: https://www.youtube.com/watch?v=c1rI6_Lg3eQ)

I will take a look at this in the future when it's very well developed and stable builds are being released. From what I've read, it's in alpha or whatever. I use Eclipse and am tired of it.

Hydroque, that was a completely unnecessary post.

@nsigma: man I’ve been watching Praxis Live silently for the past few years and I’ve gotta say I’ve been incredibly impressed the entire time. Glad to hear you’re seeing a growing userbase and seeing videos your users are creating! That definitely is a great feeling.

Quick question: what is “Steampunk VJ-ing”?

@Hydroque - Praxis LIVE has been out of alpha and very stable for a good while now. It is not, and never will be, an alternative to Eclipse - its aims are very distinctly different.

Thanks @ra4king

Well, Steampunk VJ-ing. Just seemed appropriate for a Victorian-looking device that allows you to play with video FX. ;D

I’m very interested in your shader node editor component. Since your code base is really large, how big are the chances that the node editor canvas could be extracted, so that it could be integrated and reused in other GUI applications?

Thanks! I wouldn’t necessarily call it a shader node editor, though - shaders are still written in code. The graphical node editor works at a level above your Java or GLSL code, providing a way to easily route and reroute data through your code.

The graphical node editor is based on top of the NetBeans Visual Library, which is Java2D based. It is theoretically feasible to use that as a library outside of the NetBeans platform. My extensions to it are mainly in the praxis.live.graph module.
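(By way of a rough sketch of what using the Visual Library outside the NetBeans platform could look like - this uses only the stock library API, not the praxis.live.graph extensions, so the widget and action names here are plain Visual Library ones. It needs the org-netbeans-api-visual jar and a display:)

```java
import javax.swing.JFrame;
import javax.swing.JScrollPane;

import org.netbeans.api.visual.action.ActionFactory;
import org.netbeans.api.visual.widget.LabelWidget;
import org.netbeans.api.visual.widget.Scene;

public class MiniGraphDemo {

    public static void main(String[] args) {
        // a Scene is the root of a Visual Library widget tree
        Scene scene = new Scene();

        // a simple draggable node
        LabelWidget node = new LabelWidget(scene, "my-node");
        node.getActions().addAction(ActionFactory.createMoveAction());
        scene.addChild(node);

        // the scene renders into an ordinary Swing component
        JFrame frame = new JFrame("Visual Library sketch");
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.add(new JScrollPane(scene.createView()));
        frame.setSize(400, 300);
        frame.setVisible(true);
    }
}
```

Because it's all Java2D/Swing underneath, the resulting view embeds in any Swing application like any other JComponent.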

Heya,
I’m working on a little project for a play I’ve got in the works and have a query.

I’m using the built-in video:analysis:simple-tracker with a processed input stream, like in the examples, and I’d like to take the blob info it generates and render on the original non-processed input stream. But if you don’t connect the output of the tracker, it seems to just shut off - I presume an optimization because that channel isn’t being used for anything. At the moment I’m just fading the two with a mix ratio of 99%, so you essentially only see the non-processed channel, but I’m wondering if there is a better solution?

Also simply amazing program. Love it.

@quew8 - Thanks! Glad you’re having fun with it. Please do share images, video, etc. of your project when you’re done. Sounds great.

So, yes, this is a bit of a known issue, and I keep thinking I should make a little component for “side-chaining” like this. You’re right that the channel is being switched off as an optimization. You could switch to using video:composite - put the blob-processed stream into dst and the non-processed stream into src, with mix on full. That should do the trick.

Or, create a video:custom and edit the code to -


@In(1) PImage in;        // the original, non-processed stream
@In(2) PImage sidechain; // connect the tracker output here - it keeps that channel active without being drawn

public void draw() {
  copy(in);    // pass the non-processed stream straight through to the output
  release(in); // release the input frame once it's been copied
}

That should also do it.

A release candidate of Praxis LIVE v3 is now available, with lots of exciting new features. It’s feature complete, but not fully tested / documented yet.

I’m currently on my way to Canada for the International Conference on Live Coding, where I’ll be presenting a paper and demo using Praxis LIVE.

The full release will be done ASAP when I get back (yes, ran out of time! ;D ). In the meantime …

v3.0.0-rc1 download - https://github.com/praxis-live/praxis-live/releases
v3 key changes - https://github.com/praxis-live/support/issues/34

So, Praxis LIVE v3.0.0 is available - well, actually it has been available for a few weeks now, but I forgot to update this … :persecutioncomplex:

Downloads from www.praxislive.org, source code on GitHub.

Lots of changes have been made under the hood, including updating the embedded compiler to Java 8 with some new lambda goodness in the API, performance improvements in the OpenGL pipeline, updating GStreamer video support to 1.x, and revising all the audio components to be stereo and live-codeable. Distributed hubs support has been improved with the ability to run the compiler on another process / machine, and a lightweight HTTP file server has been added for asset sharing. And a new internal type for binary data (needed for bytecode) makes it easier to pass complex data around (e.g. FFT data - see video).

See the full list of changes at http://praxis-live.readthedocs.io/en/latest/releases/

Since the release I wrote a blog post about Cyberphysical coding on the JVM, which also talks about some more general uses of live real-time coding and links to my ICLC paper mentioned above. If this interests you too, say hi! :slight_smile:

I’ll finish with some Funky Origami - here’s a video rendering of one pass through one of the new examples using some of those new features, demonstrating generative audio and controlling a simple 3D shape with FFT data.

(video: https://www.youtube.com/watch?v=VBaY2_XXlaI)