Error using WASAPI with PortAudio on Win7

I'm trying to use PortAudio and libsndfile to play .wav files in exclusive mode on my Windows 7 machine, but I'm getting
error number -9984, "Incompatible host API specific stream info".
I've filled out the PaWasapiStreamInfo struct as follows:
struct PaWasapiStreamInfo wasapiInfo;
wasapiInfo.size = sizeof(PaWasapiStreamInfo);
wasapiInfo.hostApiType = paWASAPI;
wasapiInfo.version = 1;
wasapiInfo.flags = paWinWasapiExclusive;
wasapiInfo.channelMask = NULL;
wasapiInfo.hostProcessorOutput = NULL;
wasapiInfo.hostProcessorInput = NULL;
wasapiInfo.threadPriority = eThreadPriorityProAudio;
Then I assign the hostApiSpecificStreamInfo parameter and open the stream via Pa_OpenStream as follows:
/* stereo or mono */
out_param.channelCount = sfinfo.channels;
out_param.sampleFormat = paInt16;
out_param.suggestedLatency = _GetDeviceInfo(out_param.device)->defaultLowOutputLatency;
out_param.hostApiSpecificStreamInfo = (&wasapiInfo);
err = Pa_OpenStream(&stream, NULL, &out_param, sfinfo.samplerate,
paFramesPerBufferUnspecified, paClipOff,
output_cb, file);
Have I missed a step?
Thanks,
Tyler

The technique you used to run the stream in exclusive mode worked for me. It may be the case that you're not opening a stream on a WASAPI device. Depending on your system configuration you may have DirectSound and WMME devices as well. The following code will verify whether the device referenced by index deviceIndex is a WASAPI device or not:
bool isWasapi = Pa_GetHostApiInfo(Pa_GetDeviceInfo(deviceIndex)->hostApi)->type == paWASAPI;
You also need to specify the same index in the out_param struct:
out_param.device = deviceIndex;
You did a couple of things that I did not. In your example you tried to set the thread priority, but the PortAudio documentation states that the following line:
wasapiInfo.threadPriority = eThreadPriorityProAudio;
will have no effect because you did not set the paWinWasapiThreadPriority bit in wasapiInfo.flags. By the same rule, it is unnecessary to explicitly set the other variables to NULL. To fix this, set wasapiInfo.flags as follows:
wasapiInfo.flags = paWinWasapiExclusive | paWinWasapiThreadPriority;
This should enable exclusive mode and cause the threadPriority variable to take effect.
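Putting those pieces together, a minimal sketch of the corrected setup might look like this (assuming deviceIndex holds the index of a WASAPI output device, verified as described above):
/* zero-initialize so the unused members are NULL without setting each one explicitly */
struct PaWasapiStreamInfo wasapiInfo = {0};
wasapiInfo.size = sizeof(PaWasapiStreamInfo);
wasapiInfo.hostApiType = paWASAPI;
wasapiInfo.version = 1;
wasapiInfo.flags = paWinWasapiExclusive | paWinWasapiThreadPriority;
wasapiInfo.threadPriority = eThreadPriorityProAudio;

out_param.device = deviceIndex; /* the WASAPI device checked above */
out_param.hostApiSpecificStreamInfo = &wasapiInfo;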

Related

Differentiating Mode 2 Form 1 from Mode 2 Form 2 on XA CD-ROMs?

I'm developing a library for reading CD-ROMs and the ISO9660 file system.
Long story short, pretty much everything is working except for one thing I'm having a hard time figuring out:
Where does the XA standard define how to differentiate Mode 2 Form 1 from Mode 2 Form 2?
Currently, I am using the following pseudo-code to differentiate between the two forms; although it's a naive heuristic, it does work, but it's far from ideal:
var buffer = ... // this is a raw sector of 2352 bytes
var m2F1 = ISector.Cast<SectorMode2Form1>(buffer);
var edc1 = EdcHelper.ComputeBlock(0, buffer, 16, 2056);
var edc2 = BitConverter.ToUInt32(m2F1.Edc, 0);
var isM2F1 = edc1 == edc2;
if (isM2F1) return CdRomSectorMode.Mode2Form1;
// NOTE we cannot reliably check EDC of M2F2 since it's optional
var isForm2 =
m2F1.SubHeaderCopy1.SubMode.HasFlag(SectorMode2Form1SubHeaderSubMode.Form2) &&
m2F1.SubHeaderCopy2.SubMode.HasFlag(SectorMode2Form1SubHeaderSubMode.Form2);
if (isForm2) return CdRomSectorMode.Mode2Form2;
return CdRomSectorMode.Mode2Formless;
If you look at software like IsoBuster, it appears to be a track-level property; however, I'm failing to understand where that value would be read from within the track.
I'm actually doing something similar in TypeScript for my PS1 mod tools. It seems like you probably have it correct here, since I'm going to assume your HasFlag check is checking bit position 6 of the subheader's submode byte. If that flag is set, you are in Form 2.
So you probably want something like:
const sectorBytes = new Uint8Array(buffer);
// the submode byte of the XA subheader sits at offset 0x12 of a raw 2352-byte sector;
// bit 0x20 is the Form 2 flag
if ((sectorBytes[0x012] & 0x20) === 0x20) {
    return CdRomSectorMode.Mode2Form2;
} else {
    return CdRomSectorMode.Mode2Form1;
}
You could of course use the flag code you already have, but that would require you to cast to one of the types first. This just keeps it as generic bytes, checks the flag, and returns the relevant mode.

vlcj media option "--no-overlay" doesn't work?

I would like to turn off VLC's hardware acceleration option to avoid a lagging issue caused by a graphics card driver bug. I tried to pass that option in the prepareMedia method. That didn't help (as it does when I do it from the command line: vlc --no-overlay 'path-to-video'); it actually even seemed to make the playback a bit more laggy. Below is part of my code to set up the player. I also tried playMedia("path-to-video","--no-overlay") and that didn't work either.
mediaPlayerComponent = new EmbeddedMediaPlayerComponent();
player = mediaPlayerComponent.getMediaPlayer();
...
player.prepareMedia("path-to-video","--no-overlay");
Some of those options must be passed when creating the MediaPlayerFactory rather than when playing the media - as to why it's like this, well it's just how LibVLC works.
If you're using EmbeddedMediaPlayerComponent you can do something like this to supply those options:
mediaPlayerComponent = new EmbeddedMediaPlayerComponent() {
    @Override
    protected String[] onGetMediaPlayerFactoryArgs() {
        return new String[] {"--no-overlay"};
    }
};
Note that this will replace the default media player factory arguments so you might like to specify some other ones too - these are the defaults:
protected static final String[] DEFAULT_FACTORY_ARGUMENTS = {
"--video-title=vlcj video output",
"--no-snapshot-preview",
"--quiet-synchro",
"--sub-filter=logo:marq",
"--intf=dummy"
};
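For example, a sketch that keeps those defaults and simply appends the extra option (adjust the list to whatever defaults your vlcj version ships with):
mediaPlayerComponent = new EmbeddedMediaPlayerComponent() {
    @Override
    protected String[] onGetMediaPlayerFactoryArgs() {
        // the default factory arguments listed above, plus the option we actually want
        return new String[] {
            "--video-title=vlcj video output",
            "--no-snapshot-preview",
            "--quiet-synchro",
            "--sub-filter=logo:marq",
            "--intf=dummy",
            "--no-overlay"
        };
    }
};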
So that is how you set such native VLC options, but whether this particular option will do what you actually want (and without any other side effects) is another matter.

Limitation on NumberOfElements in a scatter/gather list

My device driver for a PCIe FPGA is based on 7600.16385.1\src\general\PLX9x5x
Upon ReadFile in the application, PLxEvtIoRead is called:
//
// Initialize this new DmaTransaction.
//
status = WdfDmaTransactionInitializeUsingRequest(
devExt->ReadDmaTransaction,
Request,
PLxEvtProgramReadDma,
WdfDmaDirectionReadFromDevice );
//
// Execute this DmaTransaction.
//
status = WdfDmaTransactionExecute( devExt->ReadDmaTransaction,
WDF_NO_CONTEXT);
....
Upon calling WdfDmaTransactionExecute, PLxEvtProgramReadDma is called.
BOOLEAN
PLxEvtProgramReadDma(
IN WDFDMATRANSACTION Transaction,
IN WDFDEVICE Device,
IN WDFCONTEXT Context,
IN WDF_DMA_DIRECTION Direction,
IN PSCATTER_GATHER_LIST SgList
)
{
KdPrint(("SgList->NumberOfElements = %d\n", SgList->NumberOfElements));
}
The problem:
I want to transfer a large amount of data (around 1 GB) via this scatter/gather list, but it seems NumberOfElements is limited by something, such that the largest transfer is 1 MB (255 elements in the list, each 4 KB). I changed MaximumTransferLength in the function below to 500 MB:
WDF_DMA_ENABLER_CONFIG_INIT(&dmaConfig,
WdfDmaProfileScatterGatherDuplex,
deviceContext->MaximumTransferLength);
But I still cannot transfer more than 1 MB.
What is limiting NumberOfElements, and how can I solve it?
I needed to change the second parameter of the WDF_DMA_ENABLER_CONFIG_INIT function to WdfDmaProfileScatterGather64, and of course you have to make sure that the hardware (the FPGA, or whatever sits on the other side of the PCIe endpoint) supports 64-bit addressing.
I just changed my code as below:
WDF_DMA_ENABLER_CONFIG_INIT(&dmaConfig,
WdfDmaProfileScatterGather64,
deviceContext->MaximumTransferLength);

Trying to connect to Ticket Printer using VB6 Winsock

I am trying to send data from a VB6 program to a ticket printer via TCP/IP. The only VB6 way I have found to try to do this is using the WinSock control.
I use the following code to connect
WinSock.Protocol = sckTCPProtocol
WinSock.RemoteHost = txtIPAddress.Text
WinSock.RemotePort = txtPort.Text
WinSock.Connect
And then try and send the data as follows
WinSock.SendData ("<F8>" & txtPrint.Text & "<p>")
Every time I try to do this, it fails because the Winsock.State is 6 (Connecting). It just stays at Connecting and never connects or fails. I am able to connect to the printer using this IP/port combo outside of VB6. Is there anything I may be doing wrong? Can the WinSock control do this?
In a .NET program that was provided, this seems to be accomplished by doing the following:
CONNECT
client = new TcpClient(ip_address, 9100);
s = client.GetStream(); //s is System.Net.Sockets.NetworkStream
s.ReadTimeout = 500; //attempt to read for up to 0.5 seconds
sr = new StreamReader(s); //create read stream
sw = new StreamWriter(s); //create write stream
sb = new BinaryWriter(s); //create binary stream
sw.AutoFlush = true; //set write stream to flush data when < full buffer
SEND:
sw.WriteLine(command);
Thank you.
You are mixing up concepts. I remember this from 15 years ago.
Winsock is for working with a protocol. You must know the printer's protocol; it is not just plain text.

How do I set up DirectX 9 so that backface culling is off, z-buffering is on, and gouraud shading works, for triangle meshes without normals data?

I've been having difficulty identifying the correct parameters for the PresentParameters and the DirectX device, so that there can be both vertex-level Gouraud shading and the use of a z-buffer. Some triangle meshes work fine; others have background triangles appearing in front of triangles that are closer to the camera.
An example of this is found here: http://gallery.me.com/robert.perkins/100045/zBufferGone. The input data is a simple list of vertices in facets. The winding order of the vertices in each facet is nondeterministic (comes from various CAD software export functions) and there is no normals data.
The PresentParameters are being set up right now as follows. I realize this is C# instead of C++ but I think it's descriptive enough, and the parameters pass through to C++ code. This produces the image in the picture; the behavior is the same on the Reference device:
pParams = new PresentParameters()
{
BackBufferWidth = this.ClientSize.Width,
BackBufferHeight = this.ClientSize.Height,
AutoDepthStencilFormat = Format.D16,
EnableAutoDepthStencil = true,
SwapEffect = SwapEffect.Discard,
Windowed = true
};
_engineDX9 = new EngineDX9(this, SlimDX.Direct3D9.DeviceType.Hardware, SlimDX.Direct3D9.CreateFlags.SoftwareVertexProcessing, pParams);
_engineDX9.DefaultCamera.NearPlane = 0;
_engineDX9.DefaultCamera.FarPlane = 10;
_engineDX9.D3DDevice.SetRenderState(RenderState.Ambient, false);
_engineDX9.D3DDevice.SetRenderState(RenderState.ZEnable, ZBufferType.UseZBuffer);
_engineDX9.D3DDevice.SetRenderState(RenderState.ZWriteEnable, true);
_engineDX9.D3DDevice.SetRenderState(RenderState.ZFunc, Compare.Always);
_engineDX9.BackColor = Color.White;
_engineDX9.FillMode = FillMode.Solid;
_engineDX9.CullMode = Cull.None;
_engineDX9.DefaultCamera.AspectRatio = (float)this.Width / this.Height;
All of my other setup attempts, even on the reference device, return a COM error code ({"D3DERR_INVALIDCALL: Invalid call (-2005530516)"}). What are the correct setup parameters?
EDIT: The C++ class which interfaces with DirectX9 sets defaults like this:
PresentParameters::PresentParameters()
{
BackBufferWidth = 640;
BackBufferHeight = 480;
BackBufferFormat = Format::X8R8G8B8;
BackBufferCount = 1;
Multisample = MultisampleType::None;
MultisampleQuality = 0;
SwapEffect = SlimDX::Direct3D9::SwapEffect::Discard;
DeviceWindowHandle = IntPtr::Zero;
Windowed = true;
EnableAutoDepthStencil = true;
AutoDepthStencilFormat = Format::D24X8;
PresentFlags = SlimDX::Direct3D9::PresentFlags::None;
FullScreenRefreshRateInHertz = 0;
PresentationInterval = PresentInterval::Immediate;
}
Where does it return an invalid call?
Edit: I'm assuming in the new EngineDX9 call? Have you tried setting a device window handle in the present parameters?
Edit 2: Have you turned on the debug spew in the DirectX control panel to see whether it tells you what the error is?
Edit 3: Have you tried setting BackBufferWidth and BackBufferHeight to 0? What is BackBufferCount set to? It might also be worth trying Format.D24S8 for the depth-stencil format. It's "possible" your graphics card doesn't support 16-bit (unlikely though). Have you checked in the caps that the mode you are trying to create is valid? I assume, btw, that the CLR language you are using automagically sets the parameters you don't set to 0? I, personally, always prefer to be explicit in such cases ....
PS: I'm guessing here because I'm a native C++ DX9 coder, not a CLR SlimDX coder ...
Edit 4: I'm sure it's the lack of a window handle ... I'm probably wrong, but that's the only thing I can see REALLY wrong with your setup. A windowed DX9 device requires a window. Btw, set the width and height to 0 to just use the size of the window you are attaching the device to ...
Edit 5: I've really been heading down the wrong route here. There is nothing wrong with the device creation that produced your "incorrect" output. Do not mess with the present parameters; they are fine. The main reason you'll have problems with your Z-buffering is that you set the compare function to Always. This means that, regardless of what the z-buffer contains, the pixel is passed and its z is written into the z-buffer, overwriting whatever is there already. I'd wager therein lies your Z-buffering problem.
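If that is the case, a likely fix is to switch back to a conventional comparison, e.g. (a sketch using the same SlimDX calls as in the question):
// pass a pixel only if it is at least as close as what is already in the z-buffer
_engineDX9.D3DDevice.SetRenderState(RenderState.ZFunc, Compare.LessEqual);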
