1024x768 resolution with nearest neighbor scaling - NVIDIA

I'm using Arch Linux with Cinnamon on a 4K, or rather UHD (3840x2160), screen, and I would like to use a 1024x768 resolution with black bars on the left and right and nearest-neighbor scaling instead of the bilinear filtering that every monitor I've seen defaults to.
There are ways to more or less get this to work by actually running 3840x2160 but rendering at 1024x768 and scaling it up.
This can be done with xrandr or nvidia-settings.
I also managed to get some black bars going.
So what worked best for me so far was this command:
nvidia-settings -a CurrentMetaMode="DP-2: 3840x2160_60 {ViewPortIn=1024x768, ViewPortOut=2880x2160+480+0, ResamplingMethod=Nearest}"
This gives me the crisp upscaling and black bars.
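For reference, the ViewPortOut geometry follows from scaling the 768 lines up to the panel's 2160 and centering the result horizontally. A small helper (hypothetical, Python; the function name and the hard-coded 60 Hz mode suffix are my own choices) makes the arithmetic explicit:

```python
def metamode(output, screen_w, screen_h, in_w, in_h):
    """Build the MetaMode string for pillarboxing in_w x in_h onto a
    screen_w x screen_h panel with nearest-neighbour scaling.
    Assumes the picture is scaled to full panel height."""
    out_w = round(in_w * screen_h / in_h)   # preserve aspect ratio
    x_off = (screen_w - out_w) // 2         # centre -> equal black bars
    return (f'{output}: {screen_w}x{screen_h}_60 '
            f'{{ViewPortIn={in_w}x{in_h}, '
            f'ViewPortOut={out_w}x{screen_h}+{x_off}+0, '
            f'ResamplingMethod=Nearest}}')

print(metamode("DP-2", 3840, 2160, 1024, 768))
# -> DP-2: 3840x2160_60 {ViewPortIn=1024x768, ViewPortOut=2880x2160+480+0, ResamplingMethod=Nearest}
```

The result is exactly the string passed to `nvidia-settings -a CurrentMetaMode=` above, so the same helper works for other low resolutions, e.g. 800x600.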
There's one problem though: the right side of the screen is "cut off".
When I maximize windows, they act as if I were still using a 16:9 resolution, leaving the right part of, say, a browser inaccessible.
In games that scroll when you put the mouse at the edge of the screen, this breaks scrolling on the right side, while the left, top and bottom work fine.
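One way to narrow this down (my assumption, not confirmed: the X screen size may still report 3840x2160 even though ViewPortIn is 1024x768, which would explain a dead zone on the right) is to parse the "current" size out of xrandr's header line:

```python
import re

def current_screen_size(xrandr_header):
    """Parse the X screen size from xrandr's first output line, e.g.
    'Screen 0: minimum 8 x 8, current 3840 x 2160, maximum 32767 x 32767'.
    Returns (width, height) or None if the line doesn't match."""
    m = re.search(r'current (\d+) x (\d+)', xrandr_header)
    return (int(m.group(1)), int(m.group(2))) if m else None
```

Feeding it `subprocess.run(['xrandr'], capture_output=True, text=True).stdout` would show whether the reported size matches the ViewPortIn; if it doesn't, forcing the framebuffer afterwards with `xrandr --fb 1024x768` might be worth a try (an untested guess on my part).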
Does anyone know about this problem or have a better solution?
I'm open to anything, for example using some WINE settings to pull this off. Since this is mainly for playing old games, a completely different approach via WINE would be totally fine as a solution.
I've already tried all kinds of things over the last few days; at this point I would jump into the air if somebody knows a way to get this to work.

Related

Unity 2D: after-image from OLED screens in a high contrast situation

When I test my Unity 2D game on my iPhone X, all background and sprite elements on the screen have a blue "halo" when my character moves. I have looked into transparency issues on mobile, but this seems really strange: the blue halo appears only when the background is black. Anything brighter and it is absolutely fine, so I doubt it's a transparency issue, given that it appears only against a dark background.
It is visible only on mobile, so taking a screenshot is useless.
If anyone wants to test, do the following: download or open the image attached here in full screen, zoom in just a bit so the shapes take up most of the screen, then move the image left and right, slow and fast. You should see a blueish after-image around the edges. This should happen only on some OLED mobile screens.
In case anyone ever encounters this: the result I described is an after-image effect from the OLED screen on the iPhone X. I haven't tested other OLED devices, but depending on the software, I assume other models can experience it too. The black levels are incredible, but in a high-contrast situation between light and dark, an after-image is created around the edges of the contrast zone.
How to fix this?
Simply do not use full-black backgrounds or elements. In a game, a near-black color is indistinguishable from true black (0, 0, 0 RGB). This might be a common game-design principle I was simply unaware of, and maybe I am the only person to use 0,0,0 in the first place, but I hope anyone with the same issue reads this and fixes it easily.
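The workaround can even be applied mechanically to existing assets or palettes. A minimal sketch (the function name and the floor value of 8 are arbitrary choices of mine, not a measured threshold):

```python
def lift_black(rgb, floor=8):
    """Raise any channel below `floor` up to `floor`. (8, 8, 8) reads
    as black on most panels but avoids the hard OLED contrast edge
    against true (0, 0, 0) described above."""
    return tuple(max(c, floor) for c in rgb)

print(lift_black((0, 0, 0)))     # -> (8, 8, 8)
print(lift_black((200, 10, 0)))  # -> (200, 10, 8)
```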

OpenCV Colour Detection Error

I am writing a script on the Raspberry Pi to detect the majority colour in a webcam frame, and I seem to be having an issue. The following image shows me holding up my phone with a blank red image on it, yet I get an orange colour instead.
When I angle the phone, I do in fact get the expected red.
I am not sure why this is the case.
I am using a Logitech C920 webcam that emits a blue light when activated, and the monitor is also on. I am wondering whether the light from these two is causing the issue: when I angle the phone, those lights no longer hit it head-on and so no longer distort the image.
I am still not very experienced in this area, so I would appreciate explanations and possible workarounds for my problem.
Thanks
There are a few things that can mess this up:
As you already mention, the light from the monitor and the camera.
The iPhone screen is a display, so flicker and sync might also come into play.
Reflection from the iPhone screen.
If your camera has automatic control for exposure and color balance etc., the picture quality can change as you move around.
I suggest using a colored piece of non-glossy paper so that you can remove the iPhone display's effects.
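As an aside, the "majority colour" step itself can be made more robust against glare and white-balance drift. A minimal NumPy-only sketch (the bin count of 8 and the function name are my own choices; coarse binning is just one approach, taking the most populated colour bin instead of a raw mean so that a minority of washed-out pixels cannot drag the answer toward orange):

```python
import numpy as np

def majority_colour(frame, bins=8):
    """Return the centre of the most populated RGB bin in `frame`
    (an H x W x 3 uint8 array). Coarse quantisation groups slightly
    different shades of the same colour into one bin."""
    step = 256 // bins
    q = (frame // step).reshape(-1, 3).astype(np.int64)
    codes = q[:, 0] * bins * bins + q[:, 1] * bins + q[:, 2]
    top = np.bincount(codes, minlength=bins ** 3).argmax()
    r, g, b = top // (bins * bins), (top // bins) % bins, top % bins
    return (int(r) * step + step // 2,
            int(g) * step + step // 2,
            int(b) * step + step // 2)
```

A frame that is mostly red with some glare pixels still reports a red bin centre, because the glare pixels land in different, smaller bins.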

How to prevent pixel bleeding from rendering sprite-sheet generated with Zwoptex on older iOS device?

I packed several individual sprites into a big 2048x2048 sprite-sheet with Zwoptex, then scale it down to match each iOS device: 2048x2048 for iPad HD, 512x512 for iPhone, and so on.
I found out that the "Spacing Pixel" option in Zwoptex affects how the sprites render on device. That value is the space (in pixels) between the individual sprites packed into the sheet. If I set it too low, pixel bleeding is more likely to occur, on newer and better devices as well as older ones. If I increase it, the chance drops, and at a high enough value pixel bleeding (hopefully) won't happen at all.
Anyway, I set the value to around 17-20, which is really high and consumes valuable space on the sprite-sheet, and on the iPhone simulator there is still a problem.
We can only restrict some devices from installing the game by requiring a certain iOS version, but the iPhone 3GS can still update to the newest version, so I need to solve this properly.
So I want to know how to prevent the pixel bleeding problem across all iOS devices, from iPhone to iPad (Retina included).
It would be great to know any best practice or practical rule for choosing the "Spacing Pixel" value so the problem goes away when rendering.
If only the Simulator shows those artifacts, then by all means ignore them! None of your users will ever run your app in the Simulator, will they? The Simulator isn't perfect.
A spacing of 2 pixels around each texture atlas sprite frame is enough (and generally recommended) to kill all artifacts caused by pixel bleeding. If you still see artifacts, they're not a direct cause from too little spacing. They can't be.
I'm not sure about Zwoptex; do you actually have to create each scaled-down version of the texture atlas manually? You may be doing something wrong there. Try TexturePacker; I wouldn't be surprised if the artifacts go away just like that.
For example, one type of artifact is caused by not placing objects at integer positions. You may see a gap (usually a black line) between two objects if their positions are something like (1.23545, 10.0) and (41.23545, 10.0). Using integer coordinates (1,10) and (41,10) would fix the issue. The difficulty is that this goes all the way up the hierarchy: if the objects' parent node is also at a non-integer position, you can still experience this line-gap artifact.
If you search around you'll find numerous cause-and-effect discussions of cocos2d artifacts. One thing to keep in mind: do not use the CC_FIX_ARTIFACTS_BY_STRECHING_TEXEL macro. It's not a fix; it doesn't even come close. It kind of fixes the non-integer-position artifact but introduces another (much worse, IMHO): aliasing/flickering during movement.

Titanium create2DMatrix ugly transformation result

Hi, I've just about finished my third app with Titanium. This time it's not for a customer but for myself, and I put a lot of care into the UX and UI. I love Titanium, but I've hit one big limitation that I hope you can help me solve. I used code like this:
myPicImageView.transform = Ti.UI.create2DMatrix().rotate(-3);
to rotate some pictures, but the result is very ugly, with corners like little stairs (sorry for my English, I can't explain it better), as if everything were rendered at a very low resolution whenever I rotate something. Is there a way to avoid this problem?
Rotating any view by a non-multiple of pi/2 will cause jagged edges, because UIKit does not anti-alias views when rendering. For images there is a simple solution: use an image with a single-pixel transparent border around the edge; the rotated image view will then have anti-aliased edges.
See this post for more details.
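Adding that transparent rim can be done in an asset pipeline rather than by hand. A minimal sketch with NumPy (the function name is my own; the idea is that the GPU's bilinear sampling blends the edge into the fully transparent border, producing the anti-aliased edge UIKit itself doesn't provide):

```python
import numpy as np

def add_alpha_border(rgba):
    """Pad an RGBA image (H x W x 4 uint8 array) with a one-pixel,
    fully transparent border on all four sides."""
    return np.pad(rgba, ((1, 1), (1, 1), (0, 0)),
                  mode='constant', constant_values=0)
```

A 100x100 input becomes 102x102, with alpha 0 along the outermost ring and the original pixels untouched inside it.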

Image partly off screen killing as3 frame rate on IOS

I'm developing a game in AS3 for iPhone, and I've got it running reasonably well (a consistent 24fps on iPhone 3G), but I've noticed that when the "character" goes partly off screen, the frame rate drops to 10-12fps. Does anyone know why this is and what I can do to remedy it?
Update - I've been through the code pretty thoroughly and even made a new project just to test animations: I started an image offscreen and moved it across the screen and back off. Any time the image is offscreen, even partially, the frame rates are terrible; once the image is fully on screen, things pick back up to a solid 24fps. I'm using cacheAsBitmap, I've tried masking the stage, and I've tried placing the image in a movie clip and using scrollRect. I would keep objects from going off screen, except that the nature of the game has objects dropping in from the top (yes, I'm using object pooling; no, I'm not scaling anything, strictly x,y translations). And yes, I realize that Obj-C is probably the best answer, but I'd really like to avoid that if I can. AS3 is so much nicer to write in.
Try taking a look at the 'blitmasking' technique: http://www.greensock.com/blitmask
From Doyle himself:
A BlitMask is basically a rectangular Sprite that acts as a high-performance mask for a DisplayObject by caching a bitmap version of it and blitting only the pixels that should be visible at any given time, although its bitmapMode can be turned off to restore interactivity in the DisplayObject whenever you want. When scrolling very large images or text blocks, BlitMask can greatly improve performance, especially on mobile devices that have weaker processors.
