Sunday, October 20, 2019

Oil Paintings (III): Unknown

New painting of what seems to be a 19th-century military man. Let me know if you identify the portrayed gentleman or the author. I only had a poor reference picture from a Russian art book, and I'm not sure whether the author is Russian. The figure reminds me of the English Army, but I'm not sure at all. What do you think?



I had a great time painting the reflected red colors on the left gauntlet and the gold ornaments of the costume.

Saturday, May 25, 2019

Oil Paintings (II) - Albert Bierstadt

Here I present an exercise I did using one of the best-known works of the Hudson River School. It's by Albert Bierstadt this time, if I recall correctly.

Quite challenging for me. The difficulty lay in several parts.



First, all the tiny details and vegetation in the foreground. I ended up mixing several different colors and brush strokes to convey that feel of thin, mixed grass leaves of different types, and the ochres and less saturated colors of the stones and grains.

Then the feel of old bricks in the right facade, which is being eaten by nature little by little. I especially like the yellow highlights of the grass at its top. I'm quite happy with the colors and the texture.

The sky was the first thing to do, since I try to work from background to foreground, covering layers as I go. The clouds and the pastel-ish atmosphere of the original painting were certainly difficult for me, especially considering the hard time I had in a previous exercise, where I was painting a huge sunrise sky with the primary light hitting a backlit, immense set of clouds. Really happy with how it turned out this time.

Last, the background mountains, where we can see three major depth planes, each one more and more washed out. I had difficulty matching the mid-toned colors of the mountains in the middle plane, since they sat in between the almost sky-like colors of the mountains at the back and the more saturated ones closer to the viewer. I applied several layers of mixed oils until I finally got the desired in-between atmosphere.

Friday, May 10, 2019

GitHub repository

INTRODUCTION

The idea of opening my own GitHub repository has been in my head for some years now. I've been putting it off for a long time; I've always kept a personal backup at home of the script code and utilities I've developed over the years, but the fear of several hard disks crashing, including my home NAS, finally made up my mind to give GitHub a shot.

Its purpose is primarily personal use; I'm not too keen on spending time uploading the unit tests, or even on commenting, unless the code is fuzzy enough to be hard to understand.

FIRST PROJECT: SERIALIZER

https://github.com/juan-cristobal-quesada/serializer

In Python we have several built-in modules to serialize objects: json, pickle, etc. The json module serializes only basic types and some built-in data structures, whereas pickle/cPickle attempts to serialize any custom class object.
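The difference fits in a few lines. This is a minimal sketch: `Vec3` is a made-up class for illustration, and I use Python 3's pickle, which plays the role cPickle played in Python 2.

```python
import json
import pickle

class Vec3(object):
    """Hypothetical custom class used only for this illustration."""
    def __init__(self, x, y, z):
        self.x, self.y, self.z = x, y, z

data = {"name": "asset_01", "position": [1.0, 2.0, 3.0]}

# json handles basic types and built-in containers...
as_json = json.dumps(data)

# ...but raises TypeError on arbitrary custom objects.
try:
    json.dumps(Vec3(1, 2, 3))
except TypeError:
    print("json cannot serialize Vec3")

# pickle, on the other hand, serializes the custom instance as well.
blob = pickle.dumps(Vec3(1, 2, 3))
restored = pickle.loads(blob)
print(restored.x, restored.y, restored.z)
```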

There are several further modules in the Python ecosystem that intend to solve different issues with serialization. My current implementation relies on cPickle because of its speed, but it constrains the final serialized object by limiting the types of variables that are serializable. This comes in especially handy if you intend to send the serialized object over a network: fine-tuning which objects get serialized and which don't allows more control over the size.

The serializer in this project allows serialization of basic types, including basic list and dict data structures, which covers pretty much the core data of the objects we needed to send. It also provides a special class called Serializable, intended for any custom class to inherit from in order to be serialized. In the process, the path of the module is appended so that the object can be correctly reconstructed at the endpoint.
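As a rough sketch of the idea (not the repository's actual code; the class, method, and payload key names here are invented), a Serializable base class can record the module path of the concrete class so the receiving end can re-import it and rebuild the instance:

```python
import importlib
import pickle

class Serializable(object):
    """Base class: subclasses are serialized along with their module path."""

    def to_payload(self):
        # Record where the class lives so the endpoint can rebuild it.
        return {
            "module": self.__class__.__module__,
            "class": self.__class__.__name__,
            "state": self.__dict__,
        }

    @staticmethod
    def from_payload(payload):
        # Re-import the module and rebuild the instance at the endpoint.
        module = importlib.import_module(payload["module"])
        cls = getattr(module, payload["class"])
        obj = cls.__new__(cls)
        obj.__dict__.update(payload["state"])
        return obj

class Shot(Serializable):
    def __init__(self, name, frames):
        self.name = name
        self.frames = frames

# The payload itself contains only basic types, so pickling it is safe.
blob = pickle.dumps(Shot("sq010_sh020", 96).to_payload())
restored = Serializable.from_payload(pickle.loads(blob))
print(restored.name, restored.frames)
```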

The resulting object is then base64-encoded so that it can be ASCII-compliant, for example allowing it to be passed to another subprocess as an environment variable.
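The environment-variable trick looks like this in practice (a sketch with an invented variable name, `SERIALIZED_PAYLOAD`, and a trivial payload):

```python
import base64
import os
import pickle
import subprocess
import sys

payload = {"asset": "chair_01", "version": 3}

# pickle output is raw bytes, which an environment variable cannot hold
# safely; base64 turns it into plain ASCII text.
encoded = base64.b64encode(pickle.dumps(payload)).decode("ascii")

env = dict(os.environ)
env["SERIALIZED_PAYLOAD"] = encoded

# The child process decodes the variable and rebuilds the object.
child_code = (
    "import base64, os, pickle;"
    "data = pickle.loads(base64.b64decode(os.environ['SERIALIZED_PAYLOAD']));"
    "print(data['asset'], data['version'])"
)
subprocess.check_call([sys.executable, "-c", child_code], env=env)
```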


FURTHER IMPROVEMENTS

- extend the base serialization and add a readable format such as JSON, notably for debugging purposes.
- add zip compression functionality.
- add encryption functionality so that the serialized object is protected while traveling over the network.
- add support for more built-in types, such as OrderedDict and others.
- add unit test cases to showcase the usage.
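The first two improvements are cheap to prototype with the standard library alone. This is only a sketch of the direction, not the final design: a readable JSON dump for debugging, and zlib compression of the binary payload before it goes on the wire.

```python
import json
import pickle
import zlib

payload = {"shot": "sq010_sh020", "frames": list(range(1, 1001))}

# Readable companion format, handy when debugging what was sent.
debug_dump = json.dumps(payload, indent=2, sort_keys=True)

# Compressed binary form for sending over the network.
raw = pickle.dumps(payload)
compressed = zlib.compress(raw, 9)

restored = pickle.loads(zlib.decompress(compressed))
print(len(raw), "->", len(compressed), "bytes")
```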

Saturday, October 6, 2018

Oil Paintings (I)



I just wanted to share one of my first oil painting exercises. This was done a long time ago. I personally like the contrast between the hard blue shadow and the grey-yellowish texture of the background where the spotlight hits most intensely.

Also, I like the tones of the pink glass and the highlights of the vase. Given my still limited expertise as a beginner, I was quite satisfied with the results.

In my free time I'm now involved in some anatomy studies, but as soon as I finish I will try another one inspired by the Hudson River School. One of my art teachers introduced me to this movement, and I'm really drawn to those wild, almost fantasy landscapes of the American colonies.

Friday, March 2, 2018

Integrating DCCs into VFX pipelines: A generic Approach (I)

Context

Most VFX houses and boutiques tend to develop their pipelines around a core set of engines and digital content creation (DCC) tools. The pipeline grows to deal with file/folder structures, asset tracking across departments (adding to the equation some, usually web-based, digital asset management tool), and some sort of data/metadata storage, normally combining serializable formats (JSON, XML, etc.) and relational databases.

This implies a whole lot of development work, so when building a pipeline it's important, from the technical point of view as well (not only the artistic one), to take into account the programming languages, available APIs, and compiler/interpreter versions. It's a big deal. But once the choice is made, studios normally stick with it for several years, and hence with the chosen DCCs. Changing DCCs requires adapting the pipeline to support them, and normally this task is parallelized so it doesn't have an impact on current productions.

Developing for a fixed set of DCCs means you can spend time using their APIs to the fullest, code separate tools, and make the effort to integrate them in the most artist-friendly way you can. For example, if you plan to develop a working-files manager for Maya and Nuke, you may develop some core functionality common to both, but you won't trouble yourself much making a single tool talk to both. Instead, because you can afford it, you will most probably wrap that core functionality (because you hate to repeat yourself) in different widgets for each application (think of having the tool embedded in a tab).

But the approach has to be different when your plan is to integrate any possible existing 3D software out there into your pipeline!

It's easy to understand that you cannot afford, at first, to develop the input/output tools for each particular piece of software when a) you are part of a highly specialized and agile but not very numerous team, and b) you don't have all the time in the world. So you need to take a more generic approach.

Generic Approach

How about developing your core pipeline tools as standalone applications instead of having them embedded in a specific app? You are no longer completely bound to each app's scripting language: you restrict only the specific atomic actions to the software, the rest is handled by your core tools, and you no longer depend on each graphics library API and its versions. Imagine you could develop tools that work across the different Maya versions without relying on PySide and PySide2, and integrate Cinema 4D (which doesn't have any Qt binding), Blender (which is Python 3), Photoshop, and the whole Adobe suite...

In the approach we are currently taking, we are developing our core tools in Python 2.7/PySide (because it is a widely used language in VFX and you can get away with it) and using different kinds of interprocess communication, notably socket connections.
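The skeleton of that socket link is tiny. In the sketch below both ends run in one process just to show the exchange; in reality the server side would be a script listening inside the DCC, and the port number and command names are invented for the example.

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 50507  # hypothetical port reserved for one DCC instance
ready = threading.Event()

def fake_dcc_server():
    """Stands in for the listener script running inside the DCC."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((HOST, PORT))
    srv.listen(1)
    ready.set()
    conn, _ = srv.accept()
    command = conn.recv(1024).decode("utf-8")
    # A real listener would execute the command inside the app; we just echo.
    conn.sendall(("ACK:" + command).encode("utf-8"))
    conn.close()
    srv.close()

server = threading.Thread(target=fake_dcc_server)
server.start()
ready.wait()

# The standalone core tool connects and sends an atomic action.
client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect((HOST, PORT))
client.sendall(b"save_scene")
reply = client.recv(1024).decode("utf-8")
client.close()
server.join()
print(reply)
```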

But all that glitters is not gold... We have to face some difficulties.

Some stones on the road are:

- When talking to apps from the outside, you need a way to investigate how each app behaves in this respect. Ideally, one would want the DCC to come with an interpreter separate from the app executable, so that you can feed the interpreter your scripts and execute them all in the same app instance. That is not what you will encounter most of the time. The executable file is the interpreter as well, and different apps can behave differently, even the same app on different operating systems! How do you handle this?
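Probing that behaviour usually boils down to launching the executable with a script and observing the result. The sketch below uses the Python interpreter itself as a stand-in for a DCC binary; a real app would be something like a Maya or Blender executable with its own batch/headless flags, which differ per application and per OS.

```python
import subprocess
import sys

# Stand-in for a DCC executable; real binaries and their script-running
# flags vary per application and per operating system.
executable = sys.executable

script = "print('scene exported')"

result = subprocess.run(
    [executable, "-c", script],
    capture_output=True,
    text=True,
)
print(result.returncode, result.stdout.strip())
```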

- DCCs come with programming APIs in a varied bouquet of flavors. Blender alone is Python 3, while a big part of the DCCs come with Python 2 and their most recent versions haven't made the switch yet; the Adobe suite has a customized JavaScript called ExtendScript... one of a kind!

- If you plan on communicating between your tools and the apps, the tools need to know which apps are running. And if the communication is made via sockets, you start to think you need some kind of port manager and some sort of handshake system, so you can control the apps, and even talk to different instances of the same app, without spawning a new process for your tools each time...

- Also, for some apps it is not necessary to be running inside an instance already; you can just launch a standalone process and execute your scripts from there (some packages of the Adobe suite), whereas for others you need to be inside the app. This means your system needs some flexibility to adapt to these features while still staying generic.

After this, it's clear that an emphasis on the division and compartmentalization of each process acting as client and server is vital, as well as handling a clean path for errors and exceptions (nobody wants your tools to freeze or stop working because one process raised an exception and you didn't let the others die... furthermore, the whole operating system can be jeopardized by duplicate "ghost" processes!).

To be continued...




Saturday, September 2, 2017

New Portfolio Website Logo!!!

Some months have passed since my last post. Something unusual, taking into account that my posting rate during the last couple of years has been on average one per month...

That doesn't mean I haven't been doing anything; on the contrary, I've been quite busy at work. This summer we put all our efforts into the demo our company was presenting at SIGGRAPH 2017, which took place in Los Angeles the first week of August. Needless to say, we were all surprised by the reception our product had among the many industry fellows who showed interest in us. They gave us some suggestions and improvements to make, but the overall balance is pretty good and encouraging, and this means... a lot of work is waiting for us in the coming months!!

So basically, if the summer lasts two months, July and August, my holidays have been barely two weeks, with the feeling that this short vacation is an "entre-temps" between two very stormy periods.

What happened at SIGGRAPH is just the chick breaking out of its shell. Next is flying like an eagle!!

Anyway, I always try to dedicate time to doing some art, in the forms that I know: modeling in Maya, ZBrush, rigging, VFX... This time I had very pleasant moments playing with Photoshop. Ever since I prepared the template for my website I wasn't happy with the logo I made (a quick sketch in Illustrator), but I never had the time nor the eagerness to improve it until now! :)

I'm not completely satisfied with the look yet, but it is surely an improvement!
Below you can compare both logos.

Image A. Old JICEQ logo

Image B. New JICEQ logo

Thursday, March 2, 2017

Simple VFX animation Rig

Back in my old days, at the beginning of my new 3D life, I was asked to help riggers and the VFX department rig physical properties, like vector forces, with control curves. I was surprised at how easy a task this was and how much of a problem it was for some people. Although I understand it could have been ages since they left school (and yes, this is not even university-level math!), they probably didn't follow a scientific path in high school. Anyway, I'm far from being a math nerd myself, and if you are an artist not very familiar with vectors and matrices, you will probably discover how surprisingly easy this is.

What we want, basically, is to control the direction of a vector, for example the nucleus' gravity, by means of the rotation of a handle/curve control.

So basically this corresponds to rotating a vector by a rotation matrix!

vr = [M] · vo

where "vo" is the original vector direction and "vr" is the rotated vector.


Basically, you perform this operation with Maya's vectorProduct node and hook its output, in this case, right into the axis unit vector of a vortex field.
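Outside Maya, the same math is a small matrix-vector multiplication. Here is a plain-Python sketch rotating a downward-pointing gravity direction 90 degrees around the X axis (the angle and vector are just example values):

```python
import math

def rotation_x(angle_deg):
    """3x3 rotation matrix around the X axis (row-major)."""
    a = math.radians(angle_deg)
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0],
            [0, c, -s],
            [0, s, c]]

def rotate(matrix, vector):
    """vr = [M] . vo, computed row by row."""
    return [sum(matrix[row][col] * vector[col] for col in range(3))
            for row in range(3)]

vo = [0.0, -1.0, 0.0]          # original direction: gravity pointing down
vr = rotate(rotation_x(90), vo)
print(vr)                       # gravity now points along -Z (up to rounding)
```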

In the outliner you have this marvellous, beautiful arrow that serves as the curve control of a hypothetically more complex part of a rig, indicating the initial direction of the vector.

And the results are here in a demo video using nParticles and field forces!!
God, that was quick!! I think this is the shortest blog entry I've done so far!!!! And in the middle of a working week!!!!

Hope you enjoyed!