
/3/ - 3DCG


File: blender-textures.png-(159 KB, 863x494)
Anonymous 02/22/14(Sat)20:45 UTC+1 No.411178 Report

Hi. I'm an eternal beginner. I'm grappling with Blender.
Searching for answers on the web only makes me run in circles.
And the official Blender documentation is unhelpful too.
My nerves are running out.

1. DISPLAY MODE. When I set the 3D View display mode to "Texture", only image-based textures become visible. Procedural textures such as "Clouds" only become visible in "Render" display mode. WHO IS WRONG: ME OR BLENDER?

2. BUMP MAPPING. I want to use a texture (procedural "Clouds") as a bump/normal/displacement map, rather than a colour map. BUT IT'S IMPOSSIBLE TO FIND THE RELEVANT BLENDER SETTINGS! (Unless someone has already found them, of course.)

3. BAKING NORMALS. I have created two cubes. Both are identical, except one has some sculpted detail on it. Both are UV-unwrapped. I set "Render → Bake mode → Normals". I set "Render → Bake → Selected to active". I activate the detailed cube; then I activate the plain cube. Then I click "Bake". A texture gets generated. AND IT'S FLAT! A uniform blue picture. No bumps. Baking isn't working, I don't know why, and I don't know what to type into Google to fix this.

Learning things alone is stupid.
Whoever invented the saying "RTFM" was a certified autist.
Time to ask 4chan.
>>
Anonymous 02/22/14(Sat)21:02 UTC+1 No.411179 Report

>1.
That's because they are generated at render time.

>2.
The texture setting is the Normal option under Geometry.

>3.
Change the Normal Space in the bake settings. Make sure to use these settings in the texture settings when you actually use the normal map; also activate Normal Map in the Image Sampling section of the texture settings later.
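
If you'd rather script it, here's a rough bpy sketch of both fixes. This assumes the Blender 2.6x (Internal) Python API, and the material/object names are placeholders:

import bpy

# 2) Drive normals instead of colour with the Clouds texture
#    (assumes slot 0 of the material holds that texture)
mat = bpy.data.materials['Material']
slot = mat.texture_slots[0]
slot.use_map_color_diffuse = False  # stop influencing colour
slot.use_map_normal = True          # the Geometry -> Normal option
slot.normal_factor = 0.3            # keep the effect subtle

# 3) Selected-to-active normal bake: detailed cube -> plain cube
scene = bpy.context.scene
scene.render.bake_type = 'NORMALS'
scene.render.use_bake_selected_to_active = True
bpy.data.objects['CubeDetailed'].select = True        # source: selected
scene.objects.active = bpy.data.objects['CubePlain']  # target: active
bpy.ops.object.bake_image()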
>>
Anonymous 02/22/14(Sat)21:04 UTC+1 No.411180 Report

>>411178
>BUMP MAPPING
This term is deprecated.
We use normal maps now, since they're technologically superior to bump maps.
>>
Anonymous 02/22/14(Sat)21:14 UTC+1 No.411185 Report

>>411180
>This term is deprecated.
No, the technology is, not the term.
>>
Anonymous 02/22/14(Sat)21:15 UTC+1 No.411186 Report

>>411178

when you say you activate one cube and then the other, do you
>click a cube
>click the next cube

or do you

>click the detailed cube
>hold shift
>select the other cube

?
>>
Anonymous 02/22/14(Sat)21:30 UTC+1 No.411190 Report

>>411179
Hmm... "Geometry"... let me find where it is.
I will use your answer as a guide,
then will report how it worked for me. Give me some time.
Thanks.

>>411180
>This term is deprecated.
Thanks. I used it loosely.

>>411186
>click the detailed cube
>hold shift
>select the other cube
THIS IS IT!
Oh, thank you!
>>
Anonymous 02/22/14(Sat)21:42 UTC+1 No.411192 Report

>>
Anonymous 02/22/14(Sat)21:57 UTC+1 No.411196 Report

Only actual images will be displayed outside rendered mode, yes.

The only way around it is to create two materials per object: one with the procedurals, the other with a texture you've assigned to the object for baking, mapped to UVs. And rebake it every time you make changes.
Obviously this isn't very good if you're relying on generated mapping, because working on your mesh will modify the geometry, but if you just need some visual feedback on your procedurals without having to enter the cumbersome rendered mode, it's OK.
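
A rough sketch of that bake-target setup in bpy, if scripting helps. Assuming the 2.6x Internal API; "BakedClouds" is a placeholder name, and the mesh must already be unwrapped:

import bpy

obj = bpy.context.active_object
me = obj.data

# Blank image to bake the procedural into
img = bpy.data.images.new("BakedClouds", width=1024, height=1024)

# Assign it to every face of the active UV layer so BI bakes into it
for face in me.uv_textures.active.data:
    face.image = img

bpy.context.scene.render.bake_type = 'TEXTURE'
bpy.ops.object.bake_image()
# Rebake after every change; the viewport shows this image, not the procedural.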
>>
Anonymous 02/22/14(Sat)21:58 UTC+1 No.411198 Report

>>
Anonymous 02/22/14(Sat)22:03 UTC+1 No.411199 Report

>>411198
I would consider setting Normal to a value of about 0.1 to 0.5; values above 1.0 will always look like shit.
>>
Anonymous 02/22/14(Sat)22:12 UTC+1 No.411201 Report

>>411196
Thanks. I don't desperately need that option. But it was the last straw that caused my momentary nervous breakdown. Because how many things can conspire against you at a time?

I feel better now, I feel saved. It calms me to know that procedural textures are MEANT to be invisible in "Texture" display mode. So it isn't yet another intractable mistake on my part. My life is not over.
>>
Anonymous 02/22/14(Sat)22:25 UTC+1 No.411204 Report

>>411201

y u no cycles
>>
Anonymous 02/22/14(Sat)22:27 UTC+1 No.411205 Report

http://cgcookie.com/blender/

this site right here is all you need. It has loads of amazing tutorials. Just search for what you need, and remember to take your time to learn well.
>>
Anonymous 02/22/14(Sat)22:37 UTC+1 No.411208 Report

>>411204
>y u no cycles
No GPU. I think "Cycles" is no good without a GPU... right? But maybe that's a misconception of mine. When I get adept enough to need computing power, I can rent a "render farm". No need to own a big machine to practice 3D. I think people don't make good use of the computing power they have.

>>411205
>http://cgcookie.com/blender/
I will check it, thanks.
Although "RTFM" is not a panache.
Human interaction is irreplaceable.
>>
Anonymous 02/22/14(Sat)23:56 UTC+1 No.411220 Report

A new set of weirdo problems...
1. Deleting the images in the "UV Image Editor" doesn't work. I press "Delete" (the cross icon)... and they are still there.
2. The 3D viewport reflects the changes done in the "UV Image Editor" ONLY in the "Rendered" mode. In the "Textures" mode, ONE image doesn't get updated. It's the one generated via baking. I HAVE FOUND A STRANGE REMEDY FOR THIS: painting something over it in the "UV Image Editor" updates it in the 3D viewport. (Selecting "Undo" afterwards doesn't ruin this remedy.) It works this way... but I feel it's a bug.
3. Thumbnails (icons) of the images in the "UV Image Editor" don't get updated to reflect changes.
4. If I go to the "Textures" panel and link some image to the current texture, this change is reflected in the 3D viewport in the "Rendered" display mode. But not in the "Textures" display mode! There, it still shows the image linked from the "UV Image Editor".
>>
Anonymous 02/23/14(Sun)02:01 UTC+1 No.411250 Report

>>411208

Cycles is an unbiased, physically based, path-tracing render engine.

Internal is a biased rasterization engine that works by calculating which objects are visible to the camera, not by simulating the behavior of light.

The difference is HUGE and Cycles is much more accurate.

It's barely about the GPU.
I used Cycles for months on CPU while I still had an old Aspire 5737z.

Also, BI or Cycles, you should use Nodes to set up your materials; much more control and flexibility.
You will have to learn how to deal with nodes at some point, so you should start now.
>>
Anonymous 02/23/14(Sun)02:31 UTC+1 No.411254 Report

>>411220
When a material, texture or image is left without "users", it is deleted the next time you save/load the file.
Pressing X just detaches the object from the "user".
You can check the object's status by opening the list of these objects (where you assign them), and if it has a 0 listed next to it, it means it has no "users" left.
So if you want to remove something, clear the users and save/load. Pressing the F button prevents this, so toggle it off if it's on.

For the others, I think Blender has problems with packed textures. Just unpack them (save to disk) and your issues should be gone. Pressing the refresh button in the image selector within the texture/material editor should reload the image for the preview, the viewport and the UV editor.
You could also reload the file, I guess.
Just don't forget to re-save the image if you make any modifications, whether from the UV editor, the "save all edited" button in texture paint mode or the "save paint layers" button from the paint layers addon. If an image is unsaved, the "Image" menu in the UV editor will be displayed as "Image*".
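
The scripting equivalents, for the record (standard bpy datablock attributes; the image name is a placeholder):

import bpy

img = bpy.data.images['BakedClouds']
print(img.users)           # 0 means it is dropped on the next save/load
img.use_fake_user = False  # the "F" button; True keeps it alive forever
img.reload()               # same as the refresh button: re-read from disk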

>>411250
Not everyone wants to do renders.
Not everyone needs accurate renders.
And not everyone wants to render in blender.
Nodes are useful, but you don't need to use cycles to use many of them.
>>
Anonymous 02/23/14(Sun)03:23 UTC+1 No.411262 Report

>>411254

Do you have some hatred towards Cycles for whatever reason?
You're throwing out a list of things that make no sense, like you're trying hard to start a debate for no reason.

>Not everyone wants to do renders.
>Not everyone needs accurate renders.

Everything in this guy's posts indicates that he's probably trying to render things.
Besides, you should know how anyway. I had no use for learning how to create game assets, but I did, and it contributed big time to my general 3D knowledge. Learn everything. Never stop learning.

Are you limiting yourself so much in all the spheres of your life too?

>And not everyone wants to render in blender.

"I'm grappling blender" - OP

Learn rendering and 3D, and you'll know how to render and how to do 3D with any tool after a few hours. This guy's goal is learning 3D; he started with Blender. And again, all his posts indicate that he's trying to render things.

Why are you trying so hard to argue?

>Nodes are useful, but you don't need to use cycles to use many of them.

That's your worst one
"Also, [BOLD] BI or Cycles [/BOLD] , you should use Nodes to set up your materials [...]" - Me
>>
Anonymous 02/23/14(Sun)11:34 UTC+1 No.411340 Report

>>411250
Oh, so "Cycles" is ray-tracing and "Blender" is not... I didn't know this. I thought that all renderers were raytracers nowadays.

>you should use Nodes
I have no spare room left in my brain at the moment,
but I will look forward to learning about Nodes
when I'm less overwhelmed.

>>411254
>deleted next time you save/load the file.
Problem II-1 solved :).
>You can check the object's status by opening the list of these objects (where you assign them), and if it has a 0 listed next to it, it means it has no "users" left.
Found it! Pic related ("UV/Image Editor").
I have discovered that the data field disappears entirely, rather than showing 0, when no users are left. Oddly enough, clicking "F" instantly makes it 2 – I never see a value of 1.

>For the others, I think Blender has problems with packed textures.
So I'm probably not doing anything wrong... good. Bugs become less of a problem if they are acknowledged.

>>411254
>Not everyone wants to do renders.
>Not everyone needs accurate renders.
>And not everyone wants to render in blender.
>>411262
>all his posts indicate that he's trying to render things.
I'm not sure where my posts indicate it, but the >>411254 guy is right. I may want to do renders eventually, but my focus at this stage is modelling.

Despite the disagreement between you, I have learned something from you both. Thank you!
>>
Anonymous 02/23/14(Sun)12:42 UTC+1 No.411358 Report

>>411262
I'm sorry, I just skimmed over it before writing my reply and kind of wrote that last one thinking you'd used nodes as a reason to switch to Cycles.

I'm not attacking you. I'm not saying you're a retard for using cycles or something. I'm just saying not everyone needs to learn it if they're just making game assets for example.
OP sounds like he started not long ago, so why add something unnecessary to his list of things to do? If BI suits his needs, let him use it. He can already do things with it, so why pressure him into googling for at least several hours to relearn everything he knows how to do now, when it's unclear if he even needs this "accuracy"?
Being overwhelmed with things to learn is the number one reason people stop learning.

I also kind of dislike it when people evangelize cycles so much, because that kind of attitude may one day lead to the removal of BI given how aggressive blender's community is with these things. Devs are apparently sending mixed signals about this, so it may very well happen.
I like BI and don't want to be forced to use only cycles.
It does have some great features, mainly for people who do renders, but not everyone needs them.

>>411340
>Oddly enough, clicking "F" instantly makes it 2 – I never see a value of 1.
I'm guessing it's omitted when there is only 1. The UV editor counts as a user, and so does the F button, making it 2.
Pic is what I mean by 0.
>>
Anonymous 02/23/14(Sun)12:44 UTC+1 No.411360 Report

>>411340

The majority of your issues seem to be related to textures and materials, which made me believe that you were exploring rendering and not modeling.
>>
Anonymous 02/23/14(Sun)13:33 UTC+1 No.411369 Report

>>411360
I'm exploring normal maps, displacement maps and baking as potential aids in modelling. I'm not sure whether everything that I do makes sense. I experiment and learn.

One example of what I'm trying to accomplish:
0. Have a noisy (high-poly) surface.
1. Make a schematic (low-poly) surface.
2. Bake a displacement map from 0 onto 1.
3. Remove noise and correct details manually in Photoshop.
4. Add the edited displacement map back to 1.
5. Profit.
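
For what it's worth, step 2 is the same selected-to-active bake as before, just with a different bake type. A sketch, again assuming the 2.6x Blender Internal API:

import bpy

scene = bpy.context.scene
scene.render.bake_type = 'DISPLACEMENT'         # instead of 'NORMALS'
scene.render.use_bake_selected_to_active = True
# high-poly surface selected, low-poly surface active, then:
bpy.ops.object.bake_image()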
>>
Anonymous 02/23/14(Sun)13:44 UTC+1 No.411373 Report

>>411369
My problem with the described experiment is that I have just discovered that displacement maps are ineffective on low-poly models. To make them effective, such models can remain "schematic" ("low-poly looking") but must be subdivided. I'm not sure if I like this.

I had expected displacement maps in action to look magically like voxel models or something, without subdividing.

Normal maps work fine on low-poly objects (such as a cube).
>>
Anonymous 02/23/14(Sun)17:51 UTC+1 No.411431 Report

Progress.

As a learning experiment, I have applied the displacement map as a modifier rather than as a texture.

NOW THE RESULT IS VISIBLE IN THE "TEXTURES" DISPLAY MODE.
No "Rendered" display mode needed.
(Red box #2 in the pic.)

New problems:
1. The map reflects a differential (tangent) of the curvature instead of the curvature itself. I'm not sure why. Is it the difference between baking a normal map and a displacement map? Aren't the two interchangeable?
2. The displacement is not only perpendicular to the surface. In some places, it seems parallel to it. Parallel... or something else, I don't know what this is. (Red box #1 in the pic.)

Captcha: general rendermode
>https://archive.is/3UpkF
>>
Anonymous 02/23/14(Sun)18:00 UTC+1 No.411434 Report

>>411431
Forget problem no. 2. Solved. I zoomed in on it; it's just how a UV seam edge transitions into a depression mandated by the displacement map.

Problem 3: rectangular "alien" patterns.

They make it look like a Borg spaceship. (Picture.) Why? The displacement map is perfectly smooth there.
>>
Anonymous 02/23/14(Sun)22:12 UTC+1 No.411493 Report

>>411431
Problem III.1 solved:
1. Must not use normal maps as displacement maps.
2. For normal maps, must set "Texture → Image Sampling → Normal Map → Tangent".

I believe it's possible to convert between normal maps and displacement maps but I will leave it for another occasion.
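
For reference, the same two fixes in bpy (2.6x names; 'NormalBake' and 'Material' are placeholders):

import bpy

tex = bpy.data.textures['NormalBake']  # an image texture
tex.use_normal_map = True              # Image Sampling -> Normal Map

slot = bpy.data.materials['Material'].texture_slots[0]
slot.use_map_normal = True         # perturb shading normals with it...
slot.use_map_displacement = False  # ...but never displace with this map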
>>
Anonymous 02/23/14(Sun)22:23 UTC+1 No.411495 Report

I still get lost in the process occasionally but I consider the overall mission of this thread accomplished. I understand normals and displacements and I know how to obtain them.

Posting the result.

Thank you, /3/, for help.
>>
Anonymous 02/23/14(Sun)22:39 UTC+1 No.411496 Report

>>411493
You can't displace with normal maps. Normal maps contain no height information; they only hold surface angles (normal vectors).
To extract height data from a normal map you need an algorithm that picks a point on the normal map, decides its height,
and then moves upslope/downslope, converting distance along the slope into a height value.

This process is computationally expensive. The NVIDIA normal-map filter for Photoshop allows you to
reconstruct height data from normal maps, but its results are underwhelming.

To displace you need a classic bump map: positional data encoded as pixel brightness, a.k.a. Z-depth or heightmap.
>>
Anonymous 02/23/14(Sun)22:46 UTC+1 No.411500 Report

>>411496
I would have thought that normals are tangents, and that tangents are height differentials:

normal = d(height)/d(x,y)

Something of this sort. It seems simple, but now I realise that, even if my formula is correct, this calculation may be very lossy. Thanks for clarifying.
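
To make my guess concrete, here's a numpy sketch of the forward (height → normal) direction, purely my own illustration outside Blender, assuming h is a 2D grayscale array. The reverse direction is the lossy reconstruction you describe.

import numpy as np

def height_to_normals(h, strength=1.0):
    # Finite-difference gradient of the height field (rows = Y, cols = X)
    dhdy, dhdx = np.gradient(h.astype(np.float64))
    # Tangent-space normal: n ~ (-dh/dx, -dh/dy, 1), then renormalized
    nx, ny = -strength * dhdx, -strength * dhdy
    nz = np.ones_like(nx)
    length = np.sqrt(nx * nx + ny * ny + nz * nz)
    n = np.stack([nx, ny, nz], axis=-1) / length[..., None]
    # Encode XYZ in [-1, 1] as RGB in [0, 255] (the usual 0.5*n + 0.5)
    return ((n * 0.5 + 0.5) * 255.0).astype(np.uint8)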
>>
Anonymous 02/23/14(Sun)23:13 UTC+1 No.411509 Report

>>411500
Well, they are normals to the tangent of the implied surface we're looking at in the map, if it's a typical tangent-space normal map.
But the three RGB values of the map only store which direction the normal is pointing (RGB as XYZ), so you only get a direction,
and you don't know how long that direction is, so you can't use it to displace, since everything has the same length.

Open up a normal map in Photoshop and look at the individual channels and you'll understand why this is.
The red shows what portion of the vector points in the X direction, the green what portion points in the Y direction,
and the blue what portion of the vector points out of the screen towards us.
>>
Anonymous 02/23/14(Sun)23:49 UTC+1 No.411516 Report

>>411509
Yeah, I wondered about the exact meaning of the colour values in the map. I thought maybe they represented the alpha, beta, gamma of a 3D angle, a quaternion, or something. My comprehension of 3D geometry is rather sketchy.

>only store which direction the normal is pointing (RGB as XYZ), so you only get a direction,
>and you don't know how long that direction is
Hey, if you know XYZ, you automatically know the distance between (0,0,0) and XYZ... no? Though I realise this is of little use when there are only 256 possible values for each component.
>>
Anonymous 02/24/14(Mon)00:11 UTC+1 No.411520 Report

>>411509
Also... in HSV colour space, V could correspond directly to distance, and H×S to direction (alpha×beta). So obtaining a height map would be a matter of converting a normal map to grayscale. Well, almost.

;)
>>
Anonymous 02/24/14(Mon)00:32 UTC+1 No.411524 Report

>>411516
Think of it this way: you have a pixel in your normal map; its RGB channel values will be numbers between 0.0 and 1.0.
Depending on the bit depth, each channel can have 256, 1024, 65536 levels, etc.

Say we have a regular 8-bit-per-channel map, so you have 256 levels of intensity, and let's say our pixel has the value RGB(25, 176, 78).
Converted to the zero-to-one range this becomes approximately RGB(0.1, 0.7, 0.3).

We now map 'RG' into a '-1' to '1' range to get a vector pointing in either the positive or negative direction for 'XY',
but we keep 'B' in the 0-to-1 range for the 'Z' component, because we don't want the normal to be able to point away from us.

So our RGB(0.1, 0.7, 0.3) becomes XYZ(-0.8, +0.4, +0.3).
The arrow that goes from the origin at XYZ(0, 0, 0) to XYZ(-0.8, +0.4, +0.3) is the direction of the normal of our surface.
This vector is then normalized, that is, a mathematical operation that makes it unit length: it gets a length of 1 but still points in exactly the same direction.
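
The same walkthrough in a few lines of Python, purely illustrative, using the numbers above (which are rounded):

import math

r, g, b = 25, 176, 78        # 8-bit pixel from the normal map
x = (r / 255.0) * 2.0 - 1.0  # ~ -0.80 : R -> X, mapped to [-1, 1]
y = (g / 255.0) * 2.0 - 1.0  # ~ +0.38 : G -> Y, mapped to [-1, 1]
z = b / 255.0                # ~ +0.31 : B -> Z, kept in [0, 1]

length = math.sqrt(x * x + y * y + z * z)        # ~ 0.94
nx, ny, nz = x / length, y / length, z / length  # unit-length normal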
>>
Anonymous 02/24/14(Mon)00:49 UTC+1 No.411528 Report

>>411495
Displacement is only for Z-depth, which is why you get that break in the mesh.
>>
Anonymous 02/24/14(Mon)01:09 UTC+1 No.411529 Report

>>411528
>Displacement is only for Z-depth, which is why you get that break in the mesh.
Yes... that's what I learned by doing this experiment!
:-)

>>411524
Oh, it's normalised to unit length! I see...

Is the XYZ global or local? I guess it can be either, judging by some options I have seen in Blender. If it's local, then XY = UV... I guess...
>>
Anonymous 02/24/14(Mon)15:28 UTC+1 No.411606 Report

Playing with displacement modifier in Blender.

Using a simple plane as a 3D canvas.

The plane is a quad subdivided 6 times (the maximum), then subdivided again, in this case once.

I have found that it's impossible to paint displacement in real time; I must refresh it manually by turning visibility off and on whenever I want to see the effect. (Red box in the picture.)

It's okay.

P.S. This is one example of how someone might want to use textures for modelling and not for rendering.
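
The same setup in bpy, roughly (2.6x API; 'Canvas' is a placeholder for the image being painted):

import bpy

obj = bpy.context.active_object

tex = bpy.data.textures.new("PaintedHeight", type='IMAGE')
tex.image = bpy.data.images['Canvas']  # the image you paint on

mod = obj.modifiers.new("Displace", type='DISPLACE')
mod.texture = tex
mod.texture_coords = 'UV'  # follow the unwrap, not object space
mod.strength = 0.2
mod.mid_level = 0.5        # mid grey = no displacement

# The crude "refresh": toggling visibility forces re-evaluation
mod.show_viewport = False
mod.show_viewport = True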
>>
Anonymous 02/24/14(Mon)15:40 UTC+1 No.411609 Report

Is it possible to apply textures to NURBS surfaces and metaballs?

That would be awesome.

I'm not counting on displacement maps, but I hope for at least normal maps and colour maps.

This answer should be easy to find on my own, but I want lunch first.
>>
Noradninja 02/24/14(Mon)22:16 UTC+1 No.411650 Report

>>411609
http://www.youtube.com/watch?v=83vblPR7Xgs
>>
Anonymous 02/24/14(Mon)22:23 UTC+1 No.411652 Report

>>411650
>http://www.youtube.com/watch?v=83vblPR7Xgs
Oh, Maya... nice.
Let me watch the whole video.
>>
Anonymous 02/24/14(Mon)23:15 UTC+1 No.411658 Report

>>411529
Yeah, this is sometimes called normalization. Divide each of the vector components by the magnitude. Or multiply by the inverse square root of the sum of the squared X, Y and Z. If you have heard of the fast inverse square root, this is what it is used for.
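
For the curious, that's the classic Quake III fast inverse square root; a Python transliteration, purely illustrative (the bit trick only pays off for 32-bit floats in C, not Python floats):

import struct

def fast_inv_sqrt(x):
    # Reinterpret the 32-bit float's bits as an unsigned integer
    i = struct.unpack('<I', struct.pack('<f', x))[0]
    i = 0x5f3759df - (i >> 1)  # the famous magic constant
    y = struct.unpack('<f', struct.pack('<I', i))[0]
    return y * (1.5 - 0.5 * x * y * y)  # one Newton-Raphson step

# Normalize (x, y, z) by multiplying with 1/sqrt(x^2 + y^2 + z^2)
x, y, z = -0.8, 0.4, 0.3
inv = fast_inv_sqrt(x * x + y * y + z * z)
nx, ny, nz = x * inv, y * inv, z * inv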

These are tangent-space normal maps you are working with, the most used kind. So XYZ is in tangent space, coming out of the face.
>>411606
It's possible to use this method to add detail to a model but a better process might be sculpting. Although in some cases the displacement method can make things easier.
>>411609
These surfaces are generated as you go, so there is no way to UV map them. However, procedural mapping should work.
>>
Anonymous 02/24/14(Mon)23:21 UTC+1 No.411660 Report

>>411609
^^Me again, hopefully. I was mostly just talking about mapping meta surfaces, which you should avoid. Not enough experience with NURBS to talk about them, though.
>>
Anonymous 02/25/14(Tue)00:36 UTC+1 No.411671 Report

>>411658
>It's possible to use this method to add detail to a model but a better process might be sculpting.
I know. I'm looking for image-making techniques that are versatile and enable a good end result without requiring massive computer resources. Sculpting is not one of them. Although I may use it too, I don't want to be dependent on it.
>>
Anonymous 02/25/14(Tue)04:26 UTC+1 No.411687 Report

>>411671
Oh, if computer performance is an issue, this is definitely a solution. Just go as high as you can through sculpting, then add finer details through painting, to be displaced at render time. If they are small or concave enough, you may even just bake them to a normal map and not displace at all.
>>
Anonymous 02/25/14(Tue)07:42 UTC+1 No.411702 Report

>>411358

>I also kind of dislike it when people evangelize cycles so much, because that kind of attitude may one day lead to the removal of BI given how aggressive blender's community is with these things. Devs are apparently sending mixed signals about this, so it may very well happen.

Ain't gonna happen. Brecht more or less abandoned BI when he wrote Cycles, but there are other people actively working on it (bringing GI to it, to be specific). A new mailing list for non-photorealistic rendering was just started last week, which focuses entirely on BI.

One of two things is going to happen: either BI will stay more or less the same, with only bugfixes to keep it working, or it'll finally be refactored as part of the BEER project (search for blender beer npr for more info on BEER). Cycles will continue to gain functionality (it has some baking support now, and volumetrics is partially working), but it's a path tracer and not everyone wants path tracing.

I'm hoping for a refactor myself - I don't use BI much but I'm glad it's there. Brecht started Cycles after giving up on fixing BI during the Sintel project though, so it'll be a long time and a lot of work before it happens.

(I'm not the guy you were replying to, BTW, I just try to keep up with Blender development)
>>
Anonymous 02/25/14(Tue)07:51 UTC+1 No.411704 Report

>>411373

Yep, displacement maps can only move vertices. There are tons of uses for them, but if you want a face that doesn't appear completely flat (after triangulation), then you need normal maps.

One use for this: if you're creating a lot of something (say, a battlefield with lots of tanks), you can keep the same displacement maps but lower the subdivision level on objects that are further away from the camera. You can also create wave patterns, rocks, etc. by using procedural textures as your displacement maps.
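
A sketch of that LOD idea in bpy (the object naming scheme and distance threshold are hypothetical; the Subsurf modifier API is standard):

import bpy

cam = bpy.context.scene.camera  # assumes the scene has a camera
for obj in bpy.data.objects:
    if not obj.name.startswith("Tank"):
        continue
    dist = (obj.location - cam.location).length
    sub = obj.modifiers.get("Subsurf") or obj.modifiers.new("Subsurf", 'SUBSURF')
    # Same displacement map on every tank; fewer subdivisions when far away
    sub.render_levels = 3 if dist < 20.0 else 1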