Wednesday, 21 December 2011

Checking Coverity builds

I'll write another Coverity-related post right after the previous one so I don't forget...

With Coverity you need to be very careful that what you analyze is really what you intended to analyze. First off, of course, you need the correct compiler configured, any missing function models defined, and the correct compilation flags used for the Coverity build. These are usually one-off jobs, but depending on your workflow there may be problems in making sure that a new snapshot really is built exactly like the previous one. You have to remember to clean the build very carefully, and so on.

There are a few things to check out:

  1. Did Coverity compile everything? At the end of the process it reports what percentage of compilation units were compiled. This should be 100%.
  2. Do you see an unexpectedly high number of fixed and/or new defects? This can reveal the issues listed above.
  3. Are there any errors in the build log (build-log.txt in the intermediate directory)? Search for "error #" in the log (see the sketch after this list). Errors should also show up in the percentage.
  4. Are there any warnings in the build log? Search for "warning #" in the log. These can prevent Coverity from analyzing individual functions.
  5. Are there any asserts? This is a rare Coverity bug and should show up in the percentage as well. Search for "Assertion failed". The build log contains instructions on how to get past this issue; it involves setting COVERITY_SUPPRESS_ASSERT.
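
Most of these checks can be scripted. A minimal sketch in Python; the log file name matches what Coverity writes into the intermediate directory, and the patterns are just the strings from the list above:

    # Scan a Coverity build log for the tell-tale patterns listed above.
    import sys

    PATTERNS = ("error #", "warning #", "Assertion failed")

    def scan_build_log(path):
        counts = dict((p, 0) for p in PATTERNS)
        with open(path) as log:
            for line in log:
                for pattern in PATTERNS:
                    if pattern in line:
                        counts[pattern] += 1
        return counts

    if __name__ == "__main__":
        # Usage: python check_log.py <intermediate-dir>/build-log.txt
        counts = scan_build_log(sys.argv[1])
        for pattern in PATTERNS:
            print("%-20s %d hits" % (pattern + ":", counts[pattern]))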

Coverity and Bitbake

Coverity is a useful tool, but boy can it take time to get working, especially if your compiler is not supported.

The latest issue, however, was more interesting. Apparently Bitbake shouldn't be a problem, but it was for us. Before running, the bitbake script cleans out all environment variables except those on a whitelist, and as it happens, Coverity requires certain variables to be kept.

This problem can be solved by editing lib/bb/utils.py. There is a function called preserved_envvars_list() which returns an array of whitelisted environment variables. In order to make Coverity work, you need to add the following variables to that list (a sketch of the edited function follows the list):

EDIT: I was told about a few more variables on the Coverity Support Forum; those are included in the list below.

        'COVERITY_BIN',
        'COVERITY_BUILD_INVOCATION_ID',
        'COVERITY_COMPILER_PATH_MISMATCH_FILE',
        'COVERITY_DEBUG',
        'COVERITY_EMIT',
        'COVERITY_IS_COMPILER',
        'COVERITY_IS_COMPILER_DESCENDANT',
        'COVERITY_LD_LIBRARY_PATH',
        'COVERITY_LD_PRELOAD',
        'COVERITY_LD_PRELOAD_32',
        'COVERITY_LD_PRELOAD_64',
        'COVERITY_OUTPUT',
        'COVERITY_PATHLESS_CONFIGS_FILE',
        'COVERITY_PREV_XML_CATALOG_FILES',
        'COVERITY_SITE_CC',
        'COVERITY_START_CWD',
        'COVERITY_TEMP',
        'COVERITY_TOP_CONFIG',
        'COVERITY_TOP_PROCESS',
        'COVERITY_USE_DOLLAR_PLATFORM',
        'LD_PRELOAD',
        'LD_PRELOAD_32',
        'LD_PRELOAD_64',
        'PLATFORM'
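
With the additions, the function ends up looking roughly like this. The pre-existing entries vary by Bitbake version, so treat this as a sketch rather than a drop-in replacement:

    # lib/bb/utils.py -- whitelist the variables Coverity needs.
    def preserved_envvars_list():
        return [
            # ... keep Bitbake's original whitelist entries here ...
            # Coverity additions:
            'LD_PRELOAD',
            'LD_PRELOAD_32',
            'LD_PRELOAD_64',
            'PLATFORM',
            'COVERITY_BIN',
            # ... plus the rest of the COVERITY_* variables listed above.
        ]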

Thursday, 10 November 2011

Like a Bus...

So, my Samsung Galaxy S started acting weird (half of the applications failed to start), so I installed Darky's ROM on it. This of course required a complete reinstall & reconfiguration of everything, so I threw in Titanium Backup for good measure for next time.

There is an issue with Spotify and external SD cards. It used to be impossible to use one on a Galaxy S; then Spotify was improved so you could fix this, but it required a reinstall of Spotify (& some manual file cleaning). Now it's really simple, even if you have already installed Spotify:
  • Log out of Spotify.
  • In the login screen hit the menu button & select "SD card location".
  • The location on the Galaxy S is /mnt/sdcard/external_sd.
That's it!

Eclipse fails to load & tool for Doxygen

I guess I've been lucky until now, but I just had my first case of Eclipse failing to start. A quick googling turned up a solution. To recap:
  1. Close Eclipse.
  2. Back up & delete workspace/.metadata/.plugins/org.eclipse.core.resources (see the sketch after this list).
  3. Start Eclipse.
  4. File -> Import -> Existing Projects into Workspace.
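
Step 2 as a sketch; the workspace path is an assumption, adjust it to your setup:

    # Back up (rather than outright delete) the metadata directory that
    # keeps Eclipse from starting. The workspace path is hypothetical.
    import os
    import shutil

    workspace = os.path.expanduser("~/workspace")
    target = os.path.join(workspace,
                          ".metadata/.plugins/org.eclipse.core.resources")
    shutil.move(target, target + ".bak")  # backup and delete in one go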

I created a rudimentary Python application for analyzing the quality of source commenting with Doxygen. The instructions are quite lacking and I'm sure there are lots of bugs left, but maybe someone will find it useful? The idea is that the script takes two inputs: the log and index.xml files created by Doxygen. Certain settings are required in the Doxygen configuration for this to work.

The script works out which members have missing or incomplete documentation and creates an XML file that looks exactly like index.xml except for the new attributes. This XML file can then be imported into e.g. Excel, where quick filters can be used to calculate the number of functions missing documentation, etc.
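
The core idea as a sketch; the warning regex and the file names are assumptions (the exact warning wording depends on the Doxygen version and settings), and the real DoxChecker does more than this:

    # Mark members from Doxygen's index.xml as (un)documented based on
    # "is not documented" warnings in the Doxygen log.
    import re
    import xml.etree.ElementTree as ET

    WARNING_RE = re.compile(r"warning: Member (\w+)")  # assumed format

    def undocumented_names(log_path):
        names = set()
        with open(log_path) as log:
            for line in log:
                match = WARNING_RE.search(line)
                if match and "is not documented" in line:
                    names.add(match.group(1))
        return names

    def annotate_index(index_path, log_path, out_path):
        missing = undocumented_names(log_path)
        tree = ET.parse(index_path)
        for member in tree.iter("member"):
            name = member.findtext("name", "")
            # The new attribute added on top of the index.xml structure.
            member.set("documented", "no" if name in missing else "yes")
        tree.write(out_path)

    annotate_index("index.xml", "doxygen.log", "index_checked.xml")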

DoxChecker is available from Gitorious.

P.S. My new computer (for which I listed some test tools) is definitely much quieter than the old box thanks to a much better case (Fractal Design R3), lots of fans, a special cooler and a quietish video card. Not as quiet as my HTPC, mind, which is practically completely quiet thanks to the lack of a video card.

Thursday, 16 June 2011

Storing wishlists

For a good while now I have been searching for a way to store the stuff I want to buy in a meaningful place, so that I can check the list from my cell phone while in a shop. I tried some note-taking applications, but I didn't really find them easy enough to use.

I just came up with a solution that fits me. I have been using del.icio.us (as it was known before) to store bookmarks, but I kind of stopped using it about two years ago when bookmark synchronization came to browsers (or rather, when I started using that feature). Bookmark sync isn't really useful for this sort of thing, though, because it just clutters your bookmarks folder.

delicious.com works better because it is focused more on tag-based browsing of your bookmarks, so there is no problem having hundreds or even thousands of bookmarks stored there. So now I can just bookmark wishlist items and tag them accordingly (e.g. "books" and "wishlist"). In the store I'll just start up Andricious and open the bookmarks tagged "wishlist". Of course I need to do some cleaning once in a while to keep the list meaningful, but then again, I'd need to do that with any system.

I like this approach because it is so flexible. The only real requirement is that the pages are named in a meaningful way (so that the name fits on the screen).

It will be interesting to see what happens to Delicious now that it's moving away from Yahoo. Hope they don't ruin it.

Monday, 13 June 2011

HTPC is dead, long live HTPC?

I haven't been happy with my HTPC box. The UI is clunky, the SW keeps crashing, and the remote that came with the chassis is downright horrible. And there are still some issues to solve which I just can't be arsed with.

So an idea struck me on my way to work: I'll move the box next to my other PC. It already has Linux on it (something I have been missing since I bought a new comp) and it has plenty of processing power (the iGPU is plenty for my use).

All I need is one of these and a DLNA server for Linux. There are plenty; I think I'll check out at least these (funny that there don't seem to be any available in Ubuntu's repositories):

TVMobili looks like the prime candidate at the moment. The main requirement is that it can stream .ISO files, since that's the format our DVDs are in at the moment. And hopefully it won't have any problems with our Sony BD player.

If the free players don't work I could have a go with Wild Media Server.

Tuesday, 7 June 2011

LTE Downlink Air-Interface

I found the explanations of how the air-interface in LTE DL works a bit complicated (well, it is complicated...), especially since it took some time to figure out how data is actually mapped. As a reminder, downlink is when data is sent from the eNodeB (LTE base station) to the UE (user equipment); uplink is the other way around.

Data in the downlink is sent using OFDMA (Orthogonal Frequency Division Multiple Access). The bandwidth (there are several possible options, e.g. 5 MHz and 20 MHz) is divided into 15 kHz subcarriers, so for instance a 5 MHz bandwidth has 300 subcarriers.
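
For reference, the standard bandwidth options work out to the following subcarrier counts:

    # LTE channel bandwidths (MHz) -> number of 15 kHz data subcarriers.
    SUBCARRIERS = {1.4: 72, 3: 180, 5: 300, 10: 600, 15: 900, 20: 1200}

    for bw in sorted(SUBCARRIERS):
        print("%4.1f MHz: %4d subcarriers" % (bw, SUBCARRIERS[bw]))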

Data is mapped in two domains: frequency and time. Symbols represent the frequency domain and frames the time domain. One symbol spans all subcarriers and is created with an IFFT (Inverse Fast Fourier Transform). Each subcarrier "holds" one modulation symbol per symbol. A modulation symbol's size depends on the modulation used; four different ones are used: BPSK (1 bit/sym, not used to transfer any data, just ACK messages), QPSK (2 bits/sym), 16QAM (4 bits/sym) and 64QAM (6 bits/sym). Data is sent/received one symbol at a time, so the more bits that can be squeezed into one modulation symbol the better (the downside is that the more bits there are, the better the signal quality has to be). Bandwidth obviously also affects throughput, since the larger the bandwidth, the more subcarriers there are.

One UE is allocated at least 12 subcarriers, but can have any multiple of 12.

That's the frequency domain. It can be thought of, in a way, as an n*sc bit pattern (n = number of bits per modulation symbol, sc = number of subcarriers) created by running the IFFT on the modulation symbols.

In the time domain a frame is split into ten subframes (each taking 1 ms), which in turn are split into two slots. Each slot holds seven symbols with the normal cyclic prefix, which is the norm (with the extended cyclic prefix, used when distances between the eNodeB and UEs are long, there are six). Data allocation for a UE is done in Resource Blocks, which consist of Resource Elements. One RE is one symbol on one subcarrier, and one Resource Block is made up of 12 subcarriers and one slot (this is why subcarriers are always divided into sets of 12). So one RB has 84 REs with the normal cyclic prefix.

The figure below tries to simplify this by starting from the modulation symbol, in this case QPSK modulated; the value of the modulation symbol here is 00. It is located in the second slot of the frame's first subframe, in that slot's last symbol. After the FFT (which does the opposite of the IFFT) the end result is that bits 26 and 27 are represented as '00'. The figure shows one frame and 12 subcarriers, so the yellow part represents one Resource Block and each individual box is one Resource Element.

This example was for single-antenna usage; with MIMO things are even more complex. In diversity mode, data is simply sent on several layers (the LTE term for antenna streams) simultaneously, which improves data quality when conditions are poor, but with spatial multiplexing data is actually split between layers and has to be combined again in the receiver. This increases throughput.

This of course isn't data per se; there are still many operations required in the PHY layer (L1) to get the actual data that the upper layers can use. The air interface handles codewords, which are created from transport blocks (received from the MAC layer) by performing the following procedures on them:
  • CRC attachment,
  • Insertion of filler bits,
  • Code block segmentation,
  • Additional CRC attachment,
  • Channel coding (turbo coding),
  • Rate matching,
  • Code block concatenation.
I won't explain these here but these operations are quite straightforward.

After this the codewords are modulated, mapped onto resource elements, the IFFT is performed, the cyclic prefix is added, D/A conversion is done, and finally the signal is mixed up to RF. This RF signal is then received by the UE, where the same operations are done in reverse.


In single-antenna mode, using 64QAM in a 20 MHz BW with the normal cyclic prefix, the maximum throughput is 100.8 Mbps. It works out like this:
  • 1200 subcarriers.
  • 6 bits per modulation symbol.
  • 7 modulation symbols per slot, 2 slots per subframe (1 ms).
  • => 1200 * 6 * 7 * 2 = 100,800 bits per 1 ms = 100.8 Mbps.
With 4x4 spatial multiplexing absolute maximum throughput for LTE is 403.2 Mbps.
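
The same arithmetic as a small sketch, which makes it easy to check other combinations:

    # Peak PHY throughput: subcarriers * bits per modulation symbol
    # * symbols per slot * 2 slots, all per 1 ms subframe.
    def peak_throughput_mbps(subcarriers, bits_per_symbol, layers=1,
                             symbols_per_slot=7):  # 7 = normal cyclic prefix
        bits_per_ms = subcarriers * bits_per_symbol * symbols_per_slot * 2
        return layers * bits_per_ms / 1e3  # kbits per ms == Mbps

    print(peak_throughput_mbps(1200, 6))            # 20 MHz, 64QAM: 100.8
    print(peak_throughput_mbps(1200, 6, layers=4))  # 4x4 MIMO:      403.2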

These numbers represent total throughput. Actual data throughput is smaller, because there are also channels that carry control and other data (the PDCCH, PCFICH, PHICH and PRACH channels; I'm not going to explain these here, maybe in some upcoming blog post) which take up resource elements, plus reference and synchronization signals etc. The data itself is also coded with CRC attachments, filler bits and channel coding. Overhead depends mainly on bandwidth and channel conditions (modulation is dropped from 64QAM to 16QAM and even QPSK, and channel coding uses more bits, as channel conditions deteriorate).

Monday, 23 May 2011

SW for analyzing, testing and benching a Windows PC

I should get my new PC today or tomorrow, so I've been looking at how to test and bench the thing. I plan on overclocking at least the processor, and maybe the graphics card as well, so I'd like to see how those changes affect performance. Here is my current list of SW I plan to install.

General purpose

  • SiSoft Sandra. Analyses the components and runs some tests.
  • PCMark Vantage. Analyses "normal use" performance, i.e. non-gaming.
  • Cinebench 11.5. Tests both CPU and GPU by rendering images using the Cinema 4D engine. Uses OpenGL.
  • CPU-Z and HWMonitor from CPUid. CPU-Z tells detailed information about the CPU and HWMonitor shows temperature for different components.
  • Afterburner to tweak the graphic card's settings.
GPU
  • 3DMark11. The classic 3D test program. DX11.
  • 3DMark Vantage. Tests DX10 performance rather than DX11.
  • Unigine Graphics Engine. There are a number of tests based on this engine, I will run at least HWBot Unigine Engine with Extreme Presets. Sanctuary and Heaven are also possibilities.
  • Crysis Warhead is a popular benchmark. Info.
  • Metro 2033. I own this game and this is popular because it has a very demanding graphics engine. The benchmark is downloaded with a DLC.
  • Mafia II. The demo includes the benchmark.
  • Stalker: Call of Pripyat. Another popular benchmark, as the benchmark is freely available.
Mass Storage

Most of these tools can also be downloaded from Guru3D. I'm not going to optimize the system for the benchmarks, i.e. no shutting off background processes etc., because my aim is to see how much faster the system gets and how much the temperatures rise, rather than to compete.

Sunday, 20 February 2011

HTPC project, part 5

OK, this DIY HTPC thing is not exactly a fun project anymore. There have been way too many problems, and there still are.

First off, I had to resort to copying the entire disk as an ISO file to the HDD. I just couldn't find a way to get decent-quality rips of any animated shows (e.g. South Park). I spent the better part of two weeks on it, and in the last attempt each disk took something like 2.5 hours to rip and the result was still noticeably worse than the original.

Then I ran into a second problem: some disks fail with dd ("input/output error") even though they play fine normally; I'd say something like 1 in 8 or 1 in 10. This was getting quite annoying, because re-reading didn't help. But now I think I have found the solution; at least the last disk that failed was read properly. And it only takes some 15-20 minutes per DVD, so transferring them to the HDD is a quick process.
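
For reference, the basic dd invocation looks roughly like this, wrapped in Python for scripting several disks. The device path and output file are made-up examples, and conv=noerror,sync is just one common workaround for read errors, not necessarily the solution I mean above:

    # Rip a DVD to an ISO with dd. conv=noerror,sync skips unreadable
    # blocks instead of aborting; the paths here are illustrative only.
    import subprocess

    def rip_to_iso(device, out_path):
        subprocess.check_call(["dd", "if=" + device, "of=" + out_path,
                               "bs=2048", "conv=noerror,sync"])

    rip_to_iso("/dev/sr0", "/mnt/storage/example_dvd.iso")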

Swiss-army-knife wasn't very useful, because using it means you have to keep the projector on all the time. I want to use a command-line tool.

XBMC naming conventions have also caused some confusion, because the Wiki is somewhat, shall we say, sparse. Some things I have found out:

When a single file contains multiple episodes:
South_Park.S01-E01-02-03-04-05-06

When an entire disk contains bonus material:
Spaced.S00-E01-02-14

Check TheTVDB for episode numbers, and use 00 as the season number.

When a movie is on several disks:
Pilvilinna_joka_romahti-cd1 and Pilvilinna_joka_romahti-cd2

Special characters (äåö) do not seem to work with XBMC name matching, and in any case the English titles are shown even when the original title is in some other language, so "Tyttö joka leikki tulella" wasn't being added to the library.
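
A throwaway helper for generating names in this convention (the convention itself is just what I pieced together above, so no guarantees it covers every case):

    # Build a multi-episode file name in the convention above, e.g.
    # South_Park.S01-E01-02-03-04-05-06
    def multi_episode_name(show, season, episodes):
        numbers = "-".join("%02d" % e for e in episodes)
        return "%s.S%02d-E%s" % (show, season, numbers)

    print(multi_episode_name("South_Park", 1, range(1, 7)))
    print(multi_episode_name("Spaced", 0, [1, 2, 14]))  # season 00 = extras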

The remote works sporadically after boot: sometimes it works, sometimes it doesn't. The LCD display doesn't work at all. And sometimes the HTPC's output signal is identified incorrectly (I don't know whether it's my A/V receiver or the projector that's at fault) and it boots up in 720p @ 60Hz. I haven't figured out any way to fix this other than rebooting.

XBMC also crashes once in a while. Maybe it's the beta-quality video drivers, don't know.

On the plus side, once the files are correctly named, XBMC reads the info from TheTVDB and TheMovieDB correctly and everything shows up nicely in the Library. But we also have quite a few videos that aren't in those databases, so I guess the only way is to write descriptive nfo files for each of them.

Sunday, 9 January 2011

HTPC project, part 4

I have had problems getting XBMC to work on a minimal Ubuntu installation. I was using these instructions, but I have now found out that some Linux desktop is required. I opted for xfce since I'm not going to use it that much and it's light-weight. After installing xfce and tweaking the settings a bit (audio settings inside XBMC and copying the xorg.conf file from those instructions) I can view videos and listen to music, so the most important stuff seems to be working now.

I set the audio to "HDMI" and the 5.1 format (I'll let my A/V receiver do the downmix), and it seems like the motherboard connects to HDMI2 rather than HDMI1.

Viewing DVDs and Blu-rays directly from the disk doesn't seem to work, and neither does ripping or transcoding them. Looks like I need to install some plugins for that. I also need to configure Samba again; I wish I had remembered to save the config file earlier...

I'd also like the video player to work so that when I select the directory containing the VOBs, it automatically starts to play the correct one. Now I get to see all the files in the directory and have to figure out which one to choose.

Stuff to do:
  • Install swiss-army-knife to allow DVD and Blu-Ray ripping/transcoding.
  • Install driver for the LCD display.
  • Install lirc for the controller.
  • Make sure the best possible image scaling algorithm is used for SD videos.
  • Make movies work like they would from a disk, i.e. no selecting single VOBs.
  • Configure Samba.
Progress:
  • SAF is now installed, and at least an initial rip of a DVD to an ISO file works. I had a few issues with this (I couldn't find the correct package, and I initially put the rip dir on the root file system, which I think ran out of space) but nothing major.
  • ISO images work well enough.
  • Samba is now configured.
But I did find a new issue: the web interface doesn't work. By all accounts it should, but even though it opens up, nothing can be done with it.