
CrazyM

Members
  • Posts: 86
  • Joined
  • Last visited


Reputation: 30

  1. The length of the bank data is twice that of the OpenAL buffer. Is this because the bank is storing the data in two channels even though my audio clip is mono? (A diagnostic sketch for comparing the two sizes follows this list.)
  2. Would love to see a bindable key or key sequence in the script editor to toggle comments on selected code blocks.
  3. Wow Ma-Shell, that's cool; for some reason I hadn't considered accessing the underlying OpenAL directly. You're right, [alGetSourcei] is what I was referring to (a sketch of that call follows this list). I've been playing with a stand-alone copy of the OpenAL SDK for a few days to get my feet wet and find my bearings. I'm going to run these new ideas through my test environment and see what kind of accuracy I get on my second pass. Thanks for your help!
  4. Is this what you're looking for? http://www.leadwerks.com/werkspace/page/documentation/_/entity/entitysetmass-r74
  5. What I have now is ultra hacky. I asked Josh if he could clarify a few details about the Bank class, which I believe is where audio data is loaded into memory before it's passed to OpenAL. Assuming this is true, that's good planning on his part, since it appears windowing the data once it's been passed to OpenAL is a no-go. My background in OpenAL only goes back a few days, so it's quite possible I'm wrong. So rather than post my hacky mess of code, I'll explain what I'm doing as I understand it. I'm creating a Source and loading a clip into it. I believe Sound::Load probably does some detection and parsing of the sample rate, channels, etc., then places the raw data into the Bank class, accessible via sound->data. I don't know what "raw data" actually means here, since to me that means the entire wav file with its header, format chunk, and PCM data, but I looked for the header and can't find it. Although OpenAL provides a function for retrieving the index of the sample currently being played, I'm unable to find a Leadwerks API that passes this through. So my first best attempt at an alternative was to convert elapsed play time to percentage complete and use that to track the approximate current index. With (roughly) the current index in hand, I iterate the bank and pop bytes off the stack. My current issues are that it's not clear to me what format the bank data is in, which means I'm guessing at how best to parse it. I'm also still struggling to fully understand FFT windowing, which I believe solves the excessive noise I'm currently seeing in the signal. I can distinguish sound from silence, but not one amplitude from another. (A sketch of the elapsed-time-to-sample-index approach follows this list.)
  6. Yes I'm looking into options for porting it to Leadwerks.
  7. This is a very rough first-pass attempt at reading amplitude data from a wav file. I'm hoping to get the signal cleaned up so that I can detect subtle differences in amplitude. Is anybody really knowledgeable about the Bank class with regard to audio processing, or about using a fast Fourier transform to smooth data? (See the windowing sketch after this list.)
  8. Right now you can drag to select/deselect, or right-click a vertex for the same effect. It's all a bit clunky right now, but I think it could be made into a viable blend shape workflow if a native solution isn't added to the API. The idea would be to load characters into an editor map, and create and save shape information for use by a controller.
  9. I gave a +1 to the other thread requesting this, so I'll +1 this one as well. This is a pretty important feature that would add tons of flexibility.
  10. Thanks, that's my thought as well. Now that I've got a couple of ways to isolate and move vertices, saving a basis shape and some target shapes should be easy; then it's just a matter of lerping values (see the lerp sketch after this list). I'd much rather have existing shapes detected and usable in the API, but I wanted to see if this could be a viable approach in case that doesn't happen.
  11. I've been playing around with some ideas for vertex editing, or possibly creating a blend shape/morph/shape key animation system.
  12. Hey macklebee, long time, no speak. Exactly what I was looking for, thank you!
  13. I can't seem to find a conversion method in the API for object space to world space. [surface:GetVertexPosition()] appears to return local coordinates, and I need to convert the output to global coordinates (see the local-to-world sketch after this list). Thanks
  14. Wow, active discussion in my absence...LOL! I don't find the manually managed source route a bad option and am somewhat ambivalent. It would be nice to simply get a reference from EmitSound and have its position updates already handled, but it's not really a big deal either way (to me anyway) now that I understand how it works.
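
A few sketches related to the posts above follow. First, for item 1: one way to check where the factor of two comes from is to query OpenAL's own bookkeeping for the buffer and compare it with the bank size. This is a minimal diagnostic, assuming buffer is the ALuint buffer backing the clip and bankSize is the byte count reported by the bank (how you obtain both from the engine is an assumption); 16-bit PCM alone would already account for two bytes per sample even in mono.

    // Diagnostic sketch: compare OpenAL's view of the buffer with the bank size.
    #include <AL/al.h>
    #include <cstdio>

    void CompareSizes(ALuint buffer, int bankSize)
    {
        ALint sizeBytes = 0, bits = 0, channels = 0, frequency = 0;
        alGetBufferi(buffer, AL_SIZE, &sizeBytes);      // buffer size in bytes
        alGetBufferi(buffer, AL_BITS, &bits);           // bits per sample (8 or 16)
        alGetBufferi(buffer, AL_CHANNELS, &channels);   // 1 = mono, 2 = stereo
        alGetBufferi(buffer, AL_FREQUENCY, &frequency); // sample rate in Hz

        int bytesPerSample = (bits / 8) * channels;
        printf("OpenAL buffer: %d bytes, %d-bit, %d channel(s), %d Hz\n",
               sizeBytes, bits, channels, frequency);
        if (sizeBytes > 0)
            printf("Bank: %d bytes (%.2fx the OpenAL buffer)\n",
                   bankSize, (float)bankSize / sizeBytes);
        if (bytesPerSample > 0)
            printf("Samples in buffer: %d\n", sizeBytes / bytesPerSample);
    }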
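
For item 3, the call in question, assuming you can get at the underlying OpenAL source handle (how that handle is exposed by the engine is the open question). AL_SAMPLE_OFFSET reports the playback position in samples and AL_SEC_OFFSET reports it in seconds.

    // Minimal sketch: query the current playback position directly from OpenAL.
    // Assumes 'source' is the ALuint source that is playing the clip.
    #include <AL/al.h>

    int GetCurrentSampleIndex(ALuint source)
    {
        ALint sampleOffset = 0;
        alGetSourcei(source, AL_SAMPLE_OFFSET, &sampleOffset); // index of the sample being played
        return sampleOffset;
    }

    float GetCurrentPlayTime(ALuint source)
    {
        ALfloat secOffset = 0.0f;
        alGetSourcef(source, AL_SEC_OFFSET, &secOffset);       // same position, in seconds
        return secOffset;
    }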
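
For item 5, a sketch of the elapsed-time fallback described there. It assumes the bank holds headerless 16-bit mono PCM (which would also fit item 1's factor of two), and it assumes accessors named GetSize() and PeekShort(); the actual Bank method names and data layout should be checked against the engine.

    // Sketch of the "percentage complete" fallback: convert elapsed play time to
    // an approximate sample index and read that sample from the bank.
    // Assumptions: raw 16-bit mono PCM with no header, bank->GetSize() returns
    // the size in bytes, bank->PeekShort(pos) reads a 16-bit value at byte offset 'pos'.
    float GetAmplitudeAtTime(Bank* bank, float elapsedSeconds, float clipLengthSeconds)
    {
        if (clipLengthSeconds <= 0.0f) return 0.0f;

        const int bytesPerSample = 2;                       // 16-bit mono
        int totalSamples = bank->GetSize() / bytesPerSample;
        if (totalSamples <= 0) return 0.0f;

        // Approximate the current sample index from percentage complete.
        float progress = elapsedSeconds / clipLengthSeconds;
        if (progress < 0.0f) progress = 0.0f;
        if (progress > 1.0f) progress = 1.0f;
        int sampleIndex = (int)(progress * (totalSamples - 1));

        // Read the signed 16-bit sample and normalize to [-1, 1].
        short sample = bank->PeekShort(sampleIndex * bytesPerSample);
        return sample / 32768.0f;
    }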
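
For item 7: a window function does not smooth the time-domain signal by itself; it tapers each analysis frame so that an FFT of that frame leaks less energy between bins. A Hann window over a frame around the current index looks like the sketch below, and for a simple amplitude envelope the windowed RMS of the frame is often enough without any FFT at all.

    // Sketch: apply a Hann window to one analysis frame and compute its RMS.
    // 'samples' is assumed to be normalized floats in [-1, 1] (see the previous sketch).
    #include <cmath>
    #include <vector>

    float WindowedRms(const std::vector<float>& samples, int start, int frameSize)
    {
        if (frameSize < 2) return 0.0f;

        double sumSquares = 0.0;
        for (int i = 0; i < frameSize; ++i)
        {
            int idx = start + i;
            if (idx < 0 || idx >= (int)samples.size()) continue;

            // Hann window: 0.5 * (1 - cos(2*pi*n / (N-1))), tapers the frame edges to zero.
            float w = 0.5f * (1.0f - cosf(2.0f * 3.14159265f * i / (frameSize - 1)));
            float v = samples[idx] * w;
            sumSquares += v * v;
        }
        return (float)sqrt(sumSquares / frameSize);
    }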
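
For item 10, the lerp step described there, assuming each shape is simply an array of vertex positions captured from the same surface in the same order. The Vec3f stand-in and the write-back step are placeholders for whatever the engine actually exposes.

    // Sketch: blend from a basis shape toward a target shape by a weight in [0, 1].
    #include <vector>

    struct Vec3f { float x, y, z; };   // stand-in for the engine's vector type

    void BlendShapes(const std::vector<Vec3f>& basis,
                     const std::vector<Vec3f>& target,
                     float weight,
                     std::vector<Vec3f>& out)
    {
        out.resize(basis.size());
        for (size_t i = 0; i < basis.size() && i < target.size(); ++i)
        {
            // Linear interpolation per component: basis + (target - basis) * weight.
            out[i].x = basis[i].x + (target[i].x - basis[i].x) * weight;
            out[i].y = basis[i].y + (target[i].y - basis[i].y) * weight;
            out[i].z = basis[i].z + (target[i].z - basis[i].z) * weight;
        }
        // Each out[i] would then be written back to the corresponding surface
        // vertex and the surface updated, using whatever vertex-write call the API provides.
    }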
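
For item 13: generically, a local vertex position becomes a world position by multiplying it by the owning entity's 4x4 global matrix. How that matrix is obtained is engine-specific, and Leadwerks also appears to offer a Transform helper for converting points between spaces, so treat the names here as assumptions and check the documentation; this sketch only shows the math.

    // Sketch: convert a vertex position from entity-local space to world space
    // using the entity's 4x4 world matrix in column-major (OpenGL-style) order.
    struct V3 { float x, y, z; };

    V3 LocalToWorld(const float m[16], const V3& p)
    {
        // world = M * [p, 1]; the matrix columns are the entity's basis vectors
        // and its world position.
        V3 w;
        w.x = m[0]*p.x + m[4]*p.y + m[8]*p.z  + m[12];
        w.y = m[1]*p.x + m[5]*p.y + m[9]*p.z  + m[13];
        w.z = m[2]*p.x + m[6]*p.y + m[10]*p.z + m[14];
        return w;
    }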