I have this piece of code that was being used like this:
var _loc2 = new Color("_level1.shellContainer.INTERFACE.BALLOONS.p" + _loc3 + ".balloon_mc");
_loc2.setRGB(_loc4);
I want to apply a glow filter to the clip at "_level1.shellContainer.INTERFACE.BALLOONS.p" + _loc3 + ".balloon_mc", but I don't know how, because I can't build a reference to balloon_mc: the path needs the value of _loc3 after BALLOONS.p, and I don't know how to insert _loc3 into it. If someone could tell me how to add _loc3, or how to add a glow filter without having to do that, it would be great. I'm also using ActionScript 2.
I'm pretty sure that's not the correct way to do this, but as you only posted a couple of lines of code, I can't help you with that part.
However, since you seem to want to change the colors of an object with a glow filter, you can just use it like this:
import flash.filters.GlowFilter;
object.filters = [new GlowFilter(0xFFFFFF, 1, 6, 6, 9, 1, false, false)]; // quality 1 = low
The parameters are:
GlowFilter(color:uint = 0xFF0000, alpha:Number = 1.0, blurX:Number = 6.0,
blurY:Number = 6.0, strength:Number = 2, quality:int = 1, inner:Boolean = false,
knockout:Boolean = false)
alpha : Number
The alpha transparency value for the color.
blurX : Number
The amount of horizontal blur.
blurY : Number
The amount of vertical blur.
color : uint
The color of the glow.
inner : Boolean
Specifies whether the glow is an inner glow.
knockout : Boolean
Specifies whether the object has a knockout effect.
quality : int
The number of times to apply the filter.
strength : Number
The strength of the imprint or spread.
See: http://help.adobe.com/en_US/FlashPlatform/reference/actionscript/3/flash/filters/GlowFilter.html (an ActionScript 3 reference, but the ActionScript 2 GlowFilter takes the same parameters).
Try accessing the clip with bracket notation:
new Color(_level1.shellContainer.INTERFACE.BALLOONS['p' + _loc3].balloon_mc);
The same ['p' + _loc3] trick gives you the reference you need for setting the filters property as well.
I'm making a sheet with details about a bunch of fictional characters, and one column I want to have is their height. I would also really like to use Conditional Formatting with a Color Scale to color-code the tallest and shortest characters, and everything in between.
Unfortunately, I live in the US, and am used to height expressed in feet and inches (e.g. 5'10''), which Google Sheets of course does not recognize as a number. Is there any way to remedy this, besides writing everything in terms of just inches (e.g. 60), such that I could apply conditional formatting directly to the column?
I've tried different formats (e.g. 5'10), and I considered having a hidden column with just the inch value and having conditional formatting work off of that column (which doesn't work with Color Scale as far as I can tell, since you can't input a custom formula). One thought I had was somehow formatting things as an improper fraction with a denominator of 12, but hiding the denominator? I have no idea how that would work, though. I've Googled as best I can, but I haven't found anything (everything's just about changing row height, which makes sense in hindsight).
I understand that you have two goals in mind. First of all, you should decide which unit of length to use for managing heights. I have chosen inches, but you could work with feet if you need to. This simplifies the scenario and lets you work easily with the data, and you can always create a function that translates inches into the feet/inches combination when you need to show the data to a third party. This is the example table that I will use:
And this is my code, I will explain it at the bottom:
function main() {
var sheet = SpreadsheetApp.getActiveSheet();
var data = sheet.getDataRange().getValues();
data = sortTable(data);
sheet.getDataRange().setValues(data);
for (var i = 1; i < data.length; i++) {
data[i][2] = gradient(data.length, i);
}
for (var i = 1; i < data.length; i++) {
// Array row i corresponds to sheet row i + 1, because row 1 holds the headers.
sheet.getRange(i + 1, 2).setBackground("#" + data[i][2].join(""));
}
}
function sortTable(data) {
// Keep the header row in place and sort the remaining rows by height, descending.
var header = data[0];
var rows = data.slice(1).sort(function(a, b) {
return b[1] - a[1];
});
return [header].concat(rows);
}
function gradient(arraySize, position) {
// Linear interpolation between the top colour and the bottom colour,
// returned as an array of three two-digit hex strings.
var relativePosition = position / arraySize;
var topColor = [parseInt("00", 16), parseInt("7A", 16), parseInt("33", 16)]; // Green
var bottomColor = [parseInt("FF", 16), parseInt("FF", 16), parseInt("FF", 16)]; // White
var positionColor = [0, 0, 0];
for (var i = 0; i < 3; i++) {
var channel = Math.floor(topColor[i] * (1 - relativePosition) +
bottomColor[i] * relativePosition).toString(16);
positionColor[i] = ("0" + channel).slice(-2); // pad to two hex digits
}
return positionColor;
}
First of all, you have to read the data with a combination of getValues()/setValues(); once you have it, you can sort the table by height so you can build the gradient later. Notice how I separated the sorting into its own function for clarity.
After that you need a gradient colour for setBackground(). To get it, I wrote a simple linear-gradient function that interpolates the RGB values from the top of the table to the bottom. In my example the gradient fades from green to white, but you can change it; I also separated the gradient calculation into its own function. At this point you already have the sorted table written back with setValues() and its gradient colours, so you only have to apply them with setBackground() and you are done. Feel free to leave a comment if you have doubts about this approach. This would be the final result:
UPDATE
Based on your comments, I gather that you need an imperial height format. In that case you could use =INT(B2)&"' "&TRIM(TEXT(ROUND(MOD(B2,1)*12*16,0)/16,"# ??/??")&"""") (assuming that B2 contains the height). This approach uses Sheets formulas to calculate the remainder part of the height and express it as an irreducible fraction. This is the final result:
I have a strange issue when using two RenderTargets in SharpDX, using DX11.
I am rendering a set of objects that can be layered, and am using blend modes to achieve partial transparency. Rendering is done to two render targets in a single pass, with the second render target being used as a colour picker - I simply render the object ID (integer) to this second target and retrieve the object ID from the texture under the mouse after rendering.
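For context, the ID read-back under the mouse is done roughly like this (a simplified sketch; _device, _colourPickerTexture and the mouse coordinates stand in for the real members):
// 1x1 staging texture used to copy the pixel under the mouse back to the CPU.
var stagingDesc = new Texture2DDescription
{
    BindFlags = BindFlags.None,
    Format = Format.R32_SInt,
    Width = 1,
    Height = 1,
    MipLevels = 1,
    SampleDescription = new SampleDescription(1, 0),
    Usage = ResourceUsage.Staging,
    OptionFlags = ResourceOptionFlags.None,
    CpuAccessFlags = CpuAccessFlags.Read,
    ArraySize = 1,
};
using (var staging = new Texture2D(_device, stagingDesc))
{
    // Copy the single pixel under the mouse from the colour-picker target.
    var region = new ResourceRegion(mouseX, mouseY, 0, mouseX + 1, mouseY + 1, 1);
    _device.ImmediateContext.CopySubresourceRegion(_colourPickerTexture, 0, region, staging, 0, 0, 0, 0);

    var box = _device.ImmediateContext.MapSubresource(staging, 0, MapMode.Read, MapFlags.None);
    int objectId = System.Runtime.InteropServices.Marshal.ReadInt32(box.DataPointer);
    _device.ImmediateContext.UnmapSubresource(staging, 0);
}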
The issue I am getting is frustrating, as it does not happen on all computers. In fact, it doesn't happen on any of our development machines but has been reported in the wild - typically on machines with integrated Intel (HD) graphics. On these computers, no output is generated in the second render target. We have been able to reproduce the problem on a laptop here, and if we don't set the blend state, then the issue is resolved. Obviously this isn't a fix, since we need the blending.
The texture descriptions for the main render target (0) and the colour picking target look like this:
var desc = new Texture2DDescription
{
BindFlags = BindFlags.RenderTarget | BindFlags.ShaderResource,
Format = Format.B8G8R8A8_UNorm,
Width = width,
Height = height,
MipLevels = 1,
SampleDescription = sampleDesc,
Usage = ResourceUsage.Default,
OptionFlags = RenderTargetOptionFlags,
CpuAccessFlags = CpuAccessFlags.None,
ArraySize = 1
};
var colourPickerDesc = new Texture2DDescription
{
BindFlags = BindFlags.RenderTarget,
Format = Format.R32_SInt,
Width = width,
Height = height,
MipLevels = 1,
SampleDescription = new SampleDescription(1, 0),
Usage = ResourceUsage.Default,
OptionFlags = ResourceOptionFlags.None,
CpuAccessFlags = CpuAccessFlags.None,
ArraySize = 1,
};
The blend state is set like this:
var blendStateDescription = new BlendStateDescription { AlphaToCoverageEnable = false };
blendStateDescription.RenderTarget[0].IsBlendEnabled = true;
blendStateDescription.RenderTarget[0].SourceBlend = BlendOption.SourceAlpha;
blendStateDescription.RenderTarget[0].DestinationBlend = BlendOption.InverseSourceAlpha;
blendStateDescription.RenderTarget[0].BlendOperation = BlendOperation.Add;
blendStateDescription.RenderTarget[0].SourceAlphaBlend = BlendOption.SourceAlpha;
blendStateDescription.RenderTarget[0].DestinationAlphaBlend = BlendOption.InverseSourceAlpha;
blendStateDescription.RenderTarget[0].AlphaBlendOperation = BlendOperation.Add;
blendStateDescription.RenderTarget[0].RenderTargetWriteMask = ColorWriteMaskFlags.All;
_blendState = new BlendState(_device, blendStateDescription);
and is applied at the start of rendering. I have tried explicitly setting IsBlendEnabled to false for RenderTarget[1] but it makes no difference.
Any help on this would be most welcome - ultimately, we may have to resort to making two render passes but that would be annoying.
I have now resolved this issue, although exactly how or why the "fix" works is not entirely clear. Hat-tip to VTT for pointing me to the IndependentBlendEnable flag in the BlendStateDescription. Setting that flag on its own (to true), along with RenderTarget[1].IsBlendEnabled = false, was not enough. What worked in the end was filling in a complete set of values for RenderTarget[1], along with the aforementioned flags. Presumably all the other values in the second render target description are ignored, as blend is disabled, but for some reason they need to be populated. As mentioned before, this problem only appears on certain graphics cards, so I have no idea whether this is the correct behaviour or just a quirk of those cards.
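For anyone hitting the same thing, the description that ended up working looks roughly like this (the blend options given for RenderTarget[1] are arbitrary placeholders, since blending is disabled on that target; the point is simply that every field is populated):
var blendStateDescription = new BlendStateDescription { AlphaToCoverageEnable = false, IndependentBlendEnable = true };

// Render target 0: normal alpha blending, as before.
blendStateDescription.RenderTarget[0].IsBlendEnabled = true;
blendStateDescription.RenderTarget[0].SourceBlend = BlendOption.SourceAlpha;
blendStateDescription.RenderTarget[0].DestinationBlend = BlendOption.InverseSourceAlpha;
blendStateDescription.RenderTarget[0].BlendOperation = BlendOperation.Add;
blendStateDescription.RenderTarget[0].SourceAlphaBlend = BlendOption.SourceAlpha;
blendStateDescription.RenderTarget[0].DestinationAlphaBlend = BlendOption.InverseSourceAlpha;
blendStateDescription.RenderTarget[0].AlphaBlendOperation = BlendOperation.Add;
blendStateDescription.RenderTarget[0].RenderTargetWriteMask = ColorWriteMaskFlags.All;

// Render target 1 (colour picker): blending off, but every field still set.
blendStateDescription.RenderTarget[1].IsBlendEnabled = false;
blendStateDescription.RenderTarget[1].SourceBlend = BlendOption.One;
blendStateDescription.RenderTarget[1].DestinationBlend = BlendOption.Zero;
blendStateDescription.RenderTarget[1].BlendOperation = BlendOperation.Add;
blendStateDescription.RenderTarget[1].SourceAlphaBlend = BlendOption.One;
blendStateDescription.RenderTarget[1].DestinationAlphaBlend = BlendOption.Zero;
blendStateDescription.RenderTarget[1].AlphaBlendOperation = BlendOperation.Add;
blendStateDescription.RenderTarget[1].RenderTargetWriteMask = ColorWriteMaskFlags.All;

_blendState = new BlendState(_device, blendStateDescription);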
I'm trying to draw a circle around every kind of geometry (it could be any ol.geom type: Point, Polygon, etc.) in a listener registered on the 'postcompose' event. The purpose of this is to create an animation when a certain feature is selected.
listenerKeys.push(map.on('postcompose',
goog.bind(this.draw_, this, data)));
this.draw_ = function(data, postComposeRender){
var extent = feature.getGeometry().getExtent();
var flashGeom = ol.geom.Polygon.fromExtent(extent); // fromExtent is a static factory, no "new" needed
var vectorContext = postComposeRender.vectorContext;
...//ANIMATION CODE TO GET THE RADIUS WITH THE ELAPSED TIME
var imageStyle = this.getStyleSquare_(radius, opacity);
vectorContext.setImageStyle(imageStyle);
vectorContext.drawPolygonGeometry(flashGeom, null);
}
The method
drawPolygonGeometry({ol.geom.Polygon}, {ol.Feature})
is not working. However, it works when I use the method
drawPointGeometry({ol.geom.Point}, {ol.Feature})
even though the type of flashGeom is ol.geom.Polygon, which I just built from an extent. I don't want to use the point method, because extents from polygons could be received and it animates for every point of the polygon...
Finally, after analyzing how drawPolygonGeometry works in the OL3 source code, I realized that I need to apply the style with this method first:
vectorContext.setFillStrokeStyle(imageStyle.getFill(),
imageStyle.getStroke());
drawPointGeometry and drawPolygonGeometry do not use the same style instance.
I am trying to somehow replicate the range bar chart here.
I've found this reference but I don't fully grasp the code.
What I have is a series of tasks (sometimes accomplished in different periods).
let d = [("task1", DateTime.Parse("11/01/2014 08:30"), DateTime.Parse("12/01/2014 10:30"));
("task2", DateTime.Parse("15/01/2014 09:30"), DateTime.Parse("16/01/2014 10:30"));
("task3", DateTime.Parse("11/01/2014 08:30"), DateTime.Parse("16/01/2014 10:30"))]
let chart = d |> FSharp.Charting.Chart.RangeBar
chart.ShowChart()
I am struggling to understand the logic of the API.
I have also tried:
let chart = new Windows.Forms.DataVisualization.Charting.Chart(Dock = DockStyle.Fill)
let area = new ChartArea("Main")
chart.ChartAreas.Add(area)
let mainForm = new Form(Visible = true, TopMost = true, Width = 700, Height = 500)
mainForm.Controls.Add(chart)
let seriesColumns = new Series("NameOfTheSerie")
seriesColumns.ChartType <- SeriesChartType.RangeBar
type SupportToChart(serieVals: Series) =
member this.addPointXY(lbl, [<ParamArray>] yVals: Object[]) =
serieVals.Points.AddXY(lbl, yVals) |> ignore
let supporter = SupportToChart(seriesColumns)
supporter.addPointXY("AAA", DateTime.Parse("11/01/2014 08:30"), DateTime.Parse("12/01/2014 10:30") )
which results in
System.ArgumentOutOfRangeException: You can only set 1 Y values for this data point.
Has something changed in the API since then?
I'm not entirely sure that F# Charting is currently powerful enough to reconstruct the chart above. However, one of the problems seems to be that it treats dates as float values (for some reason) and incorrectly guesses the ranges. You can at least see the chart if you use:
Chart.RangeBar(d)
|> Chart.WithYAxis(Min=41650.0, Max=41660.0)
Please submit this as an issue on GitHub. If you want to dig deeper into how F# Charting works and help us get this fixed, that would be amazing :-)
The trick is initializing the Series with the constructor overload that also takes the number of Y values per point:
let serie = new Series("Range", yValues)
where yValues defines the maximum number of "Y-values" a data point can hold; for a range bar you need two (the start and the end date), so yValues should be 2.
I've been unsuccessful at getting a simple cube geometry with shading turned on to display correctly.
This is C# code, but the values are passed through SlimDX directly to C++ code.
pParams.BackBufferWidth = 0;
pParams.BackBufferHeight = 0;
pParams.BackBufferCount = 1;
pParams.BackBufferFormat = Format.X8R8G8B8;
pParams.Multisample = MultisampleType::None;
pParams.MultisampleQuality = 0;
pParams.DeviceWindowHandle = this.Handle;
pParams.Windowed = true;
pParams.AutoDepthStencilFormat = Format.D24X8;
pParams.EnableAutoDepthStencil = true;
pParams.PresentFlags = PresentFlags.None;
pParams.FullScreenRefreshRateInHertz = 0;
pParams.PresentationInterval = PresentInterval.Immediate;
pParams.SwapEffect = SwapEffect.Discard;
... are the values in the PresentParameters struct used to set up my Direct3D9 Device object.
During a rendering, SetRenderState is called as follows:
this.D3DDevice.Clear(ClearFlags.Target | ClearFlags.ZBuffer, this.BackColor, 10000.0f, 0);
this.D3DDevice.SetRenderState(RenderState.Ambient, false);
this.D3DDevice.SetRenderState(RenderState.ZEnable, ZBufferType.UseZBuffer);
this.D3DDevice.SetRenderState(RenderState.ZWriteEnable, true);
this.D3DDevice.SetRenderState(RenderState.ZFunc, Compare.LessEqual);
this.D3DDevice.BeginScene();
Again, this is passed through to C++ code, which marshals the values into calls a C++ programmer would not fear.
The primitives are diffuse colored vertices (D3DFVF_XYZ | D3DFVF_DIFFUSE). The wireframe view looks like this:
wireframe view http://gallery.me.com/robert.perkins/100045/z-fightingwireframe/web.jpg
The nearer pair of larger triangles is the near face of a cube.
The filled view looks like this:
full view 1 http://gallery.me.com/robert.perkins/100045/Z-fighting/web.jpg
Or this, on a subsequent rendering call:
full view 2 http://gallery.me.com/robert.perkins/100045/zfight2/web.jpg
I'm not sure how to fix this. Where should I begin looking?
Edit: The camera projection matrix looks about like this for one of the frames:
{[[M11:0.6281456 M12:0.7659309 M13:0.1370506 M14:0]
[M21:0.7705086 M22:-0.5877584 M23:-0.2466911 M24:0]
[M31:-0.1083957 M32:0.2605566 M33:-0.9593542 M34:0]
[M41:-3.225646 M42:-1.096823 M43:20.91392 M44:1]]}
And, the view matrix looks like this:
camera.ViewMatrix = {[[M11:0.6281456 M12:0.7659309 M13:0.1370506 M14:0]
[M21:0.7705086 M22:-0.5877584 M23:-0.2466911 M24:0]
[M31:-0.1083957 M32:0.2605566 M33:-0.9593542 M34:0]
[M41:-3.225646 M42:-1.096823 M43:20.91392 M44:1]]}
Clear the Z-buffer to 1.0f, not 10000.0f.
From the Clear docs in the SDK:
[in] Clear the depth buffer to this new z value, which ranges from 0 to 1.
It may also be useful to see your projection matrix and viewport settings ...
Edit: How do you build that projection matrix? You have set zNear to 0 and zFar to 1. Try setting your zNear to 0.001f and zFar to 1000.0f and see whether that helps you at all...
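For example, using the SlimDX helpers (a sketch; the field of view is a placeholder, and the aspect ratio assumes this is a Form/Control, as your use of this.Handle suggests):
// Clear the depth buffer to 1.0f (the far plane), not 10000.0f.
this.D3DDevice.Clear(ClearFlags.Target | ClearFlags.ZBuffer, this.BackColor, 1.0f, 0);

// Build the projection with a non-zero near plane and a sensible far plane.
float fov = (float)Math.PI / 4.0f;                                      // placeholder field of view
float aspect = (float)this.ClientSize.Width / this.ClientSize.Height;  // placeholder aspect ratio
Matrix projection = Matrix.PerspectiveFovLH(fov, aspect, 0.001f, 1000.0f);
this.D3DDevice.SetTransform(TransformState.Projection, projection);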
A hunch: Try enabling the Z-Buffer before you clear.