Subscribe: Ubuntu Diaries

My Life Is A House

My daily life with all things Libre and Open

Last Build Date: Tue, 06 Mar 2018 00:00:53 +0000


Final report - Google Tasks API support to libgdata/Evolution - there are trees behind the forest

Mon, 23 Sep 2013 19:12:00 +0000

So here comes the last summary post about this year's project. This year I took a break from working on GStreamer-related things like Jokosher or libwaveform. It started quite innocently - I really wanted to add Google Tasks support to Evolution. Yes, I know, many now assume that the NSA reads all our stuff (be serious - it doesn't), but really, my task list of all things open source and free software, or a reminder that I have to buy a kitchen sink, is hardly top secret information. Still, it would be nice to add tasks support for, say, ownCloud in the future too. Another huge reason was that I wanted to improve my C knowledge a lot, and for that reason alone these were four months well spent. Also, thanks to my mentor Philip, I learned a thing or two about writing much cleaner code; thanks to his regular pokes while reviewing commits, some of those rules got into my blood. It's worth remembering that every language, and even every project in the same language, can have a different coding style.

A first look at libgdata gave a false sense of assurance, but in fact I stumbled on a lot of issues I had never thought of before. First we had to choose how to implement general JSON support in libgdata. There are a lot of places which would require heavy refactoring if we did it in a fully intrusive style. Instead, we chose to "sneak in" support, sometimes using clever hacks, as the functionality of the code stayed the same and adding something new would mean duplicating a lot of stuff. When my last patches are committed to master I'll probably write an in-depth summary of some of these hacks, in case someone wants to join in improving JSON support in the future. I must admit that once the initial confusion wore off, the libgdata code seemed very nicely written, and it was quite easy to test my code once I got my test right. One of the more annoying problems I had was with Google authorization: we looked for ways to get an authorizer created, but it kept coming back with errors. Google Tasks and the other newer APIs quite insist on using OAuth 2.0. In the end I used GDataGoaAuthorizer in the test code (which involves manually placing a client ID in the code), but that's not a real solution, and I will try to get it working with OAuth 1.0 so Google Tasks support can be tested automatically.

As Google moves to the next generation of protocols, which use JSON, I hope my work will make it easier to add support for them in libgdata. My branch is published at GitHub here, and you can follow GNOME Bugzilla bug #657539, which has all my patches attached.

To be honest, I hadn't researched the Evolution part of my project to its full extent before I started hacking on EDS and "friends", and as a result I was a little afraid of the complexity when I discovered that adding Google Tasks support would be more challenging than I thought. It took me a lot of time to fully understand how things work in Evolution and EDS, but I slowly pulled the picture together. As I mentioned in my previous blog post, all Google ESource objects reside in modules/google-backend. I added a task list to the EGoogleBackend collection object, and defined a new backend, 'gtasks', in which the implementation complexity hides. Evolution relies on this backend implementing several virtual functions. Implementing them properly ends with Evolution automatically recognizing the ESource and reading objects from it to show in the task list.
The current state of my Evolution and EDS fixes is not quite ready for "screenshot prime time", but I am positive I will finish implementing the required changes by the end of September. I finally got my jhbuild working and have been fixing bugs like crazy for the last few days. You can see the final (for GSoC, anyway) state of my evolution-data-server branch here.

Report #4 - Google Tasks API support to libgdata/Evolution - hacking Evolution Data Server

Thu, 19 Sep 2013 22:19:00 +0000

As the Google Summer of Code 2013 finish line closes in, I'm completing the evolution-data-server part of my project and cleaning up the second part of the libgdata changes. The Evolution part of the project turned out to be more challenging than I expected, as it has many "moving parts" which I have to take into account.
First of all, evolution-data-server has a central daemon, but it also has a factory registry and factories for the addressbook and calendar. In Evolution all communication between elements is done via D-Bus. To add another source of information to this factory registry we use an ESource. All related ESources can be stored together in an ECollectionBackend subclass. In my case all Google sources can be found in EGoogleBackend (which in turn lives in the modules/google-backend directory), which I have already extended to support task lists. However, that's not the whole story. Essentially, an ESource is just a shell: you add extensions to it to enable different functionality. Each ESource also has a backend, which does all the heavy lifting - the actual data retrieval and synchronization with the remote source.
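The shape of that design - a source as a shell, extensions bolted on, and a backend doing the fetching - can be sketched as a plain-Python analogy. None of these class or method names are the real EDS C API; they only mirror the roles described above.

```python
class Source:
    """A shell: identifies a data source and holds extensions."""
    def __init__(self, uid):
        self.uid = uid
        self.extensions = {}   # extension name -> settings dict

    def add_extension(self, name, settings):
        self.extensions[name] = settings


class TaskListBackend:
    """Does the heavy lifting for one Source: fetch and sync."""
    def __init__(self, source):
        self.source = source

    def fetch_objects(self):
        # A real backend would talk to the remote server here.
        return [{"id": "task-1", "title": "Buy a kitchen sink"}]


class GoogleCollection:
    """Groups related sources, like EGoogleBackend groups Google ones."""
    def __init__(self):
        self.sources = []

    def populate(self):
        tasks = Source("google-tasklist")
        tasks.add_extension("Task List", {"backend-name": "gtasks"})
        self.sources.append(tasks)
        return self.sources


collection = GoogleCollection()
for src in collection.populate():
    backend = TaskListBackend(src)
    print(src.uid, sorted(src.extensions), backend.fetch_objects())
```

The point of the split is the same as in EDS: the collection decides which sources exist, while each backend only worries about one source's data.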
As in many systems, in Evolution tasks are seen as part of the calendaring system. As a result all calendaring backends include support (where possible) for calendars and tasks at once (notes and journals can also be added) and can be found in calendar/backends. This time, though, I'm making a brand new backend just for Google Tasks, as it has a completely different way of accessing Google APIs and retrieving results. I don't have to worry that the calendar may try to use this backend, because in EGoogleBackend you can specify in code that only task lists should use 'gtasks'.
Currently I'm trying to get 'gtasks' working with OAuth2 authorization. For this I'm following how the Contacts 'google' backend is implemented. I would like to use the GOA extension for the source (which EDS has), but as that would mean changing the whole Google source collection backend, I will have to stick with OAuth2 for now.
In addition, to get the backend working in read-only mode, I have to implement the open, get_object, get_object_list and start_view virtual functions. Their functionality is rather simple and I have a bunch of them already implemented. As a starting platform I used the 'caldav' backend, but this caused some issues with fully understanding how everything works (thanks to my mentor Milan for clearing up most of it). A little more complicated is the synchronization between local storage and the actual server, but it's still doable.
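A minimal read-only backend with those four entry points can be sketched in plain Python. The method names mirror the virtual functions named above, but the signatures and the in-memory "server" are invented for illustration; the real ECalBackend API in C differs considerably.

```python
class GTasksBackendSketch:
    def __init__(self, remote_objects):
        self._remote = remote_objects   # stand-in for the Google server
        self._cache = {}
        self._opened = False

    def open(self):
        """Connect/authorize; here we just mirror the remote data."""
        self._cache = dict(self._remote)
        self._opened = True

    def get_object(self, uid):
        return self._cache[uid]

    def get_object_list(self, query=None):
        # A real backend would evaluate a query expression here.
        return list(self._cache.values())

    def start_view(self, notify):
        """Push all matching objects to a 'view' callback, as the
        task list UI would expect."""
        for obj in self.get_object_list():
            notify(obj)


backend = GTasksBackendSketch({"t1": "Write final report"})
backend.open()
backend.start_view(print)   # prints each task once
```

Once these four are in place, a read-only task list can show up in the UI; writes (create/modify/remove) are a separate set of virtual functions.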
Over the weekend I will write one or two final entries for GSoC detailing how everything works together so far. Then I hope to hand my branches over to my mentors for review and commit. All work will then continue on the GNOME git server after GSoC.

Report #3 - Google Tasks API support to libgdata/Evolution - compiling, testing and believing

Sun, 01 Sep 2013 15:44:00 +0000

Again, after a longer pause, here's a status update about Google Tasks support in libgdata/Evolution. All of August is behind us, and there have been some good news and surprises along the way, so it's worth blogging about.

First, in a crazy travel stunt which included driving non-stop for 18 hours there and back with only one driver, my friend Rudolfs Mazurs (and our significant others) and I went, in torching 37°C heat in a car without air conditioning, to the first GUADEC of my life, in Brno. As I had to miss the first two days due to a theatre performance I had on Friday, we arrived on Saturday evening. After regrouping at the dormitories and arriving late at the conference venue, the Faculty of Information Technology at the Brno University of Technology, we met the GStreamer crowd - Transmageddon hacker and unofficial leader of sorts Christian Schaller, my last year's mentor and current de facto maintainer Sebastian Dröge and other GStreamer core guys, with the addition of Zeeshan Ali of Boxes and Maps fame - and went hunting for food. After driving 18 hours without sleep it was a nice ending to the day, with chats going around about beautifully geeky stuff like Doctor Who.

The last day consisted of a lot of interesting talks. I missed Sunday's keynote, but got the essence of it later from the buzz, and watched the recording back at home. I attended such good presentations as Stef Walter's provocative and excellent "More secure with less "security"", and Jeff Fortin's nicely delivered "Pitivi and GES, towards 1.x", which included the most precise video clip ever explaining the migration from GStreamer 0.10 to 1.0 and the bug hunting involved (yeah, it was *that* bad :)). In the middle of all this I also attended the Evolution hackfest, where I met and had a chat with my mentors, libgdata maintainer Philip Withnall and Evolution superhacker Milan Crha.
Unfortunately, due to sudden changes in planning - wanting to avoid driving back in deadly heat during the daytime, we decided to leave on Monday evening instead of Tuesday as planned - my visit was cut short. Overall GUADEC was a blast, and I really enjoyed it even for such a short visit.

I started to work on libgdata testing during GUADEC, and as usual ran into several interesting things. First of all, when Google recommends you something for using their APIs and products, you had better use it, because trying other kinda-"supported" things will yield inconsistent results. In this case, for the new JSON-based APIs Google "recommends" using OAuth2. As a result, after a week of trial and error I settled on using GNOME Online Accounts to build the Authorizer (using a GDataGoaAuthorizer subclass), which I need to build a working Service object (in this case GDataTasksService) so I can check whether things really work as they should. That included modifying GOA code to include the Google Tasks authorization scope in the get_scope method of GoaGoogleProvider, and building it with separate application keys using the --with-google-client-id and --with-google-client-secret configure flags (I needed them because you have to enable Tasks support for your application in the Google API Console, and that hasn't been done yet for the GNOME application, which is managed by Debarshi 'Rishi' Ray). As a result I had my own GOA daemon, which I ran instead of the system-installed one. I also replaced several symlinks to the libgoa and libgoabackend libraries, as I installed GOA in /opt. Now I was ready to run the tests.

Getting my own code to actual testing was kind of fun, because I learned a lot. While I'm not totally new to C and GLib, there's still a lot for me to learn.
It was the most rewarding thing too, because seeing it actually work after a long session of debugging and compiling is a very uplifting moment. While I was testing the GDataTasks code, Philip started to merge my core libgdata improvements (along with his own additions) into a special branch, to be included in master after the 0.14 release. My plan for next week is to complete libgdata Google Tasks support by finishing the insert/delete/update methods of GDataTas[...]

Report #2 - Google Tasks API support to libgdata/Evolution - how to make things stick together

Sun, 28 Jul 2013 18:16:00 +0000

As promised, here comes a second blog post with more details on how I'm implementing JSON support. It is still ongoing, as I improve the code while adding Google Tasks support itself. As I already described, libgdata is the GLib wrapper/glue library for the Google GData protocol family, which uses XML. As Google is slowly phasing XML out, all the newest APIs use JSON as the communication format. To complicate things a little, they don't use unified query parameters (at least not in a documented way) and have changed the names of these parameters. They also got rid of several key attributes, presumably phased out for lack of real applications for them. But first of all, let's talk about how parsing of a normal Google API response is done in libgdata.

How to parse JSON data for libgdata

As we know, Google exposes many services through external APIs. As a library, libgdata covers most of those usable in a desktop environment. For each service the library has a Service, a Query, a Feed and multiple Entry classes (for more, look at the overview page in the libgdata documentation). Service gets all the action, as it handles the connection and querying. Query holds all the information about, well, the query itself (and a little bit more, see below). And the Feed and Entry classes (and their subclasses) are where the actual parsing and storing of information takes place.

Requests themselves are done by __gdata_service_query (which is called by gdata_service_query, which in turn is called from any query function in a Service), using the libsoup library. My current change is, when receiving an answer, to look at the headers: on application/atom+xml follow the current code path, and on application/json parse the content using my newly implemented support. Parsing of any content happens through virtual functions, which are chained together.
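That header-based split can be sketched in Python: keep the existing Atom/XML path and branch into the JSON parser only when the server says the payload is JSON. The function names here are invented stand-ins, not libgdata's.

```python
import json

def parse_response(content_type, body):
    # Normalize away parameters like "; charset=UTF-8".
    base_type = content_type.split(";")[0].strip()
    if base_type == "application/atom+xml":
        return parse_atom(body)            # existing XML code path
    if base_type == "application/json":
        return json.loads(body)            # new JSON code path
    raise ValueError("unexpected content type: " + content_type)

def parse_atom(body):
    # Placeholder for the existing libxml-based parsing.
    return {"format": "atom", "raw": body}

print(parse_response("application/json; charset=UTF-8",
                     '{"kind": "tasks#taskLists"}'))
```

Dispatching on the Content-Type header rather than on the service class keeps one entry point for both formats, which is exactly what makes the "sneak in" approach workable.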
For example, a Tasklist object received from Google has several members in common with other object types, and a few which are unique to it. First, the feed is parsed (as a concept it doesn't appear in the Tasks API, but a list acts the same way, although its attribute set is minimal). Then, when parsing the list of items/entries from this feed, libgdata first turns to the GDataTaskList parse_json virtual function, which reads the members it recognizes, and then passes parsing on to the parent class for anything it can't find - in this case GDataEntry. Feed classes are chained similarly, although it seems I won't need one for Tasks support. In the end both Feed and Entry pass the torch to their parent class, Parsable, which collects any members not parsed further up the chain and calls g_debug with a message about the unknown attribute. When everything is parsed (the answer into a Feed, the objects into Entries), the object is returned to the query function and after that to the library user.

Query or not to query

While the Query class itself is rather simple (it is just a collection of query parameters as properties, with a getter/setter for each), it has one important advanced feature: pagination support. For that, Query stores the 'next' and 'previous' tags from the XML feed. For the new JSON protocols things got a little complicated, as the feed returns only a 'nextPageToken' member. Overall, similar to the other classes, the Query class is chained through get_query_uri with its parent, which stores the general search parameters, and together they build a query which is then passed to __gdata_service_query for retrieving the actual data. For now, 'previous' support for JSON protocols will be broken, but in the future it could be implemented by storing a history of the previous pages of a query. I also had to disable chaining for Tasks support because, as I said at the beginning, the URI parameters are named quite differently.
Instead, I just get the value of each property from the parent class GDataQuery and build the URI parameter at the GDataTasksQuery level.

Making it work together

As I wrote previously, it all comes together when the user requires a specific query, for example, request[...]
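The two mechanisms described above - parse_json-style chaining, where a subclass consumes the members it knows and hands the rest to its parent, and pagination driven by 'nextPageToken' instead of next/previous links - can be sketched together. All names here are illustrative, not the libgdata API.

```python
class Entry:
    def parse_json(self, member, value, obj):
        if member in ("id", "title"):          # members every entry has
            setattr(obj, member, value)
            return True
        return False                            # unknown: logged in real code

class TaskList(Entry):
    def parse_json(self, member, value, obj):
        if member == "updated":                 # member specific to task lists
            obj.updated = value
            return True
        return super().parse_json(member, value, obj)  # chain to parent

class Obj:
    pass

tl, parser = Obj(), TaskList()
for m, v in {"id": "list1", "updated": "2013-09-23", "etag": "x"}.items():
    if not parser.parse_json(m, v, tl):
        pass  # Parsable would g_debug() about the unknown 'etag' here

# Pagination: keep requesting pages until the server stops handing
# back a nextPageToken. 'previous' would need a token history.
def fetch_all(fetch_page):
    items, token = [], None
    while True:
        page = fetch_page(token)
        items += page["items"]
        token = page.get("nextPageToken")
        if token is None:
            return items

pages = {None: {"items": [1, 2], "nextPageToken": "p2"},
         "p2": {"items": [3]}}
print(fetch_all(lambda t: pages[t]))
```

Note how the chaining means each class only knows its own members; everything else either bubbles up or ends as a debug message about an unknown attribute.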

Late to the party, aka the first report of Google Tasks support to libgdata/Evolution

Fri, 26 Jul 2013 22:33:00 +0000

This is my terribly late first report on my project for this summer - this spring was my last at my dear uni, and therefore my schedule got a little bit screwed up. However, now I'm back on track and crunching code as the midterm review closes in.
This time I'm adding Google Tasks support to libgdata and also to Evolution. As an avid Google Tasks user myself I always wanted to try my hand at this, and therefore jumped at the first opportunity to do it as this year's GSoC project. My mentors are superhackers Philip Withnall and Milan Crha of Evolution fame. I have used a lot of their work every day (Evolution and GMail integration for the last two years, Evolution as my email client for 8 years, I think), and it is a little strange to delve into code I have been dependent on for years. For those who don't know, libgdata is the library which lets you get all things from Google services using their API on the GNOME desktop (although libgdata is quite platform-neutral, with a small dependency set). It is written in C and GObject, and currently uses libxml to parse the XML responses from Google.
I started by understanding what the Google Tasks API looks like and how it will fit into the libgdata universe. The biggest change that comes with implementing it is the response format change from XML to JSON. There are tons of JSON libraries for C available, but only one works well within the GLib and GObject universe - JSON-GLib. By version number it might feel unfinished, but it is universally available and installed by default in all major distributions. It also has what looks like complete documentation.
The main work for the first two weeks was to figure out how to properly implement JSON support without breaking or changing the current API much. I will have to define a lot of new virtual functions for the base classes, so the ABI will be broken anyway. The Google Tasks JSON API is very simple compared to the rest of the newest API family, so while adding general JSON support to libgdata I also have to check the other APIs (Calendar, for example), so they will fit in better when added later.
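For a feel of how simple the format is, here is roughly what a Tasks API task-list response looks like (field names as in Google's public Tasks API documentation; the values are invented), parsed with Python's json module as a stand-in for JSON-GLib:

```python
import json

response = '''{
  "kind": "tasks#taskLists",
  "nextPageToken": "abc123",
  "items": [
    {"kind": "tasks#taskList", "id": "MTIz", "title": "GSoC todo"}
  ]
}'''

feed = json.loads(response)
titles = [item["title"] for item in feed["items"]]
print(titles)
```

Compared with an Atom feed, there are no namespaces, no link elements and far fewer attributes, which is why a minimal Feed class (or none at all) suffices for Tasks.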
Currently I have started to read and hack on Evolution - more concretely, evolution-data-server - code, so I can have at least a visual demo for the midterm review and my 4 minutes of fame at GUADEC :) I will do a slightly more detailed blog post about my changes to the libgdata code base before submitting my midterm evaluation. My current work on libgdata can be found at my GitHub account.

State of libwaveform after GSoC

Mon, 05 Nov 2012 11:49:00 +0000

Hi everyone, this is the long-awaited status update on libwaveform - a library which aims to provide all the necessary tools to collect, store and draw waveform data. After dealing with my studies I have started to hack again. First, more about where I was at the end of the summer: I had the general reading part, aka WaveformReader, ready, using the GStreamer level element, and also a generic waveform drawing widget, WaveformWidget, using the gathered data. However, the drawing had no zoom support at all. So the first thing I wanted to improve post-summer was a method for dealing with zooms.

First of all I started to think about the various ways a zoom level can be measured. I went in the wrong direction several times, but in the end I settled for the same old 'pixels per second'. So the WaveformWidget object has variables which store the current, default and max/min zoom levels measured this way, and methods which increase and decrease the zoom level - currently by simply multiplying and dividing by two - which can be called from callbacks, for example when clicking buttons (see demo/ to test this out). Now that I have a changeable zoom level, I must decide how to draw the waveform at a given zoom. This is what I have in mind for now.

For example, if I have a waveform data reading interval 0.1 seconds long (or 100 000 000 ns) and a zoom level of 10 pps (pixels per second), that means one wave will be 1 pixel wide. That is not enough if I have a (usual) waveform where the minimum width of a wave side is 2 pixels - so I will request a reading interval 0.2 seconds long from WaveformData (its next reading interval upwards, calculated by adding two 0.1 s readings together), and use a step of 2 pixels (because 1 sec / 0.2 sec per reading is 5 readings, and 10 pixels / 5 readings is 2 pixels per reading).
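The calculation above can be sketched as a small helper: given a zoom level in pixels per second and a minimum wave width in pixels, pick a reading interval from the 0.1 s / 0.2 s / 0.4 s... ladder and derive the pixel step. The function name and the doubling ladder are my own framing of the post's numbers.

```python
def choose_interval(pps, min_wave_px, base_interval=0.1):
    """Pick the smallest reading interval whose waves are wide enough."""
    interval = base_interval
    while pps * interval < min_wave_px:   # wave would be too narrow
        interval *= 2                      # next reading level upwards
    readings_per_sec = 1.0 / interval
    step_px = pps / readings_per_sec       # pixels drawn per reading
    return interval, step_px

# The example from the post: 10 pps, 2 px minimum wave width.
print(choose_interval(10, 2))   # -> (0.2, 2.0)
```

At higher zoom levels the loop exits immediately and the base 0.1 s readings are used directly.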

This works nicely if I keep the waveform max and min widths in sync with the zoom levels/steps. However, I'm still struggling with the scenario where you get 2.5, 3.13, etc. pixel widths from odd zoom levels (for example 71 pixels/sec). There are two options: either I enforce particular zoom levels, calculating the "next best thing" from the numbers given by the user (similar to a size request in Gtk+), or I just warn users that the waveform alignment won't be pixel-precise. For now I will work on the second one.

The next big target is to fix the GStreamer level element to support sub-buffer intervals; otherwise, for a wav file, instead of a 100 000 000 ns interval I get 120 000 000 ns, and so on. The current code has a condition test for the given interval size, but it doesn't work, and the actual interval gets rounded up by n buffers.
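My model of that rounding behaviour (an illustration, not the element's actual code): if level can only report on buffer boundaries, a requested interval gets rounded up to a whole number of buffers.

```python
def effective_interval(requested_ns, buffer_ns):
    """Round a requested interval up to a whole number of buffers."""
    buffers = -(-requested_ns // buffer_ns)   # ceiling division
    return buffers * buffer_ns

# e.g. 40 ms buffers would turn a requested 100 ms interval into 120 ms:
print(effective_interval(100_000_000, 40_000_000))   # -> 120000000
```

The 40 ms buffer size is an assumption chosen to reproduce the 120 000 000 ns figure from the post; the real buffer duration depends on the file and the pipeline.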

For feedback, my first question is: what do you think about the zoom model I offer here and how I plan to draw the waveform according to it? What could be potential pitfalls? (I have thought of several corner cases, but a fresh view wouldn't hurt.) I would also like to know the common values you use for default, min and max zoom levels.

How to get libwaveform: the current development branch is 'postsummer'. It's buildable on Fedora 18 without any big dependencies (you need the usual suspects from Gtk+/GLib land, and also gobject-introspection-devel if I'm not mistaken); just use --prefix=/usr with ./ so it gets installed correctly on your system. For other systems, probably the best way is to use/install it with jhbuild. To check out the recent functionality, see demo/

Libwaveform final Google Summer Of Code report

Mon, 20 Aug 2012 22:55:00 +0000

Finally it's the last day of this year's Google Summer of Code - time to calm down and look back at what I have done and what's still on my todo list. As I plan to carry on developing this library, the end of the summer project is merely a milestone, albeit a very important one.

Before this summer I had been working with Gtk+ and GObject/GLib on various projects (mainly on Jokosher, which I'm still trying to port to Gst 1.0), but never as seriously as in this project (and never using C). It took some time to get into the full swing of how things are done in the C/GLib/GObject combo. As usual, C lets you take nothing for granted; as usual, successful compiling doesn't mean it really works as it should. Seriously improving my knowledge of C, GObject, Gtk+ and how to debug all this is my major gain from this project.

My project's aim was/is to create a usable set of elements which together provide a widget that draws a waveform. Those elements are WaveformReader, WaveformData and WaveformDrawing (as I have named them in my code for now). The Reader handles reading the sound levels from files (also decoding them accordingly, all using GStreamer 1.0), Data provides the data model and a method to add information to it (so far), and Drawing is a generic widget (derived from GtkDrawingArea) which walks through the data model and draws it using the cairo library.

I have done most of the stuff I initially planned for the Reader, but new ideas and issues popped up. I have one method to read levels, which takes a source URI, an interval, and start and finish times as parameters, and it works as expected. However, while testing this I got unexpected behaviour from the 'level' element - as I found out, it can't report data for an interval smaller than a buffer, so even if you set an interval different from the default, you can't expect results precisely for that buffer size. This was too hard for me to fix during the project time, but I plan to do so with help from my mentor Sebastian.
I also want to extend the 'level' element with a sort of 'lowest peak' value per buffer, as it helps to draw nicer waveforms. This is also on my todo list for post-summer coding.

For error handling in the library I chose GError. For the Reader I added several errors and tested them through Python exception handling - you can see an example in the demo/ script.

I had a lot of ideas and theories about how the data model should look, but I have settled on something quite simple for now (see the PDF here; I will add a more improved version later). The most difficult part is how to treat the more detailed data needed for zooming in. I'm planning to return to improving the data model once I get a fully working stack within a real application (like Pitivi or Jokosher). I have started a TODO file, which is naturally and slowly turning into a design document. Currently Data has an add method which detects intervals and times and puts the readings in the right places. For storing the readings for each channel I use a boxed structure (which is quite buggy, I admit, and will get a cleanup treatment post-summer) named WaveformLevelReading. For the GI annotations, the GList of readings is never transferred fully, as it's quite a chunk of data and copying it wouldn't be very smart resource-wise. In the current state of things the Reader creates a GList of readings, and when it is added to the data model, it practically becomes part of the data model, without copying.

For Drawing - the widget part - I first got stuck at creating the widget itself as an object in Python. For some strange reason a manually written (transfer full) annotation on the constructor method failed, the object wasn't transferred, and it ran into a segfault wall the next second I tried to access it. Thankfully, just removing the annotation worked. It was strange, as it was copy-pasted from another object where it worked. After that I tried to follow instructions from various tutorials - the ways to create custom GtkWidgets have changed co[...]
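The data model idea reads roughly like this sketch: one reading per interval, holding per-channel peak values, with the planned 'lowest peak' field alongside. This mirrors my reading of the post, not the actual WaveformLevelReading boxed struct.

```python
from dataclasses import dataclass, field

@dataclass
class LevelReading:
    end_time_ns: int
    peaks: list          # highest peak per channel, in dB
    low_peaks: list      # planned extension: lowest peak per channel

@dataclass
class WaveformDataSketch:
    interval_ns: int
    readings: list = field(default_factory=list)

    def add(self, reading):
        # In the C code the readings GList is adopted rather than
        # copied, for the same reason: it's a big chunk of data.
        self.readings.append(reading)

data = WaveformDataSketch(interval_ns=100_000_000)
data.add(LevelReading(100_000_000, [-6.0, -5.5], [-60.0, -58.0]))
print(len(data.readings))   # -> 1
```

The zoom problem then becomes a question of how many of these readings to merge per pixel column, which is why the coarser-interval levels matter.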

GSoC 2011 Jokosher porting to Gtk+3/PyGI - status quo

Mon, 11 Jul 2011 13:36:00 +0000

Yes, this year I'm again working on Jokosher during Google Summer of Code. Last summer I tried to implement Telepathy calls, thus creating a nice podcast tool. I partly succeeded, and anyone willing to test it can do so using this Jokosher branch of mine on Launchpad. I'm planning to return to it after my porting job this summer, because I want to see that feature in the master branch and an official release.

This summer I have a different, but still very interesting, task - porting the PyGTK stuff over to PyGI. GI stands for GObject Introspection. To get the full details about GI read here, but in a nutshell it is a method of creating mostly automated bindings, using special API scanners to create Python counterparts of C methods and variable types. Therefore the PyGI API follows the C API very closely. This gives two very obvious advantages: there is no need for complex maintenance of bindings, and they are almost always up to date.

For Jokosher it means that I have to replace most of the PyGTK syntax that differs from the Gtk+3 C API syntax. Thankfully, the pygobject hackers and other coders with their own porting tasks have created a nice bash script which replaces a lot of the obvious stuff automatically (like 'import gtk' with 'from gi.repository import Gtk'). While it helps porting considerably, you must take into account that it is quite "dumb" and can create some nasty side effects, especially in an app as large as Jokosher. So I started with the script and then ran Jokosher in a Gtk+3 environment to get errors, fixed them, and repeated these steps as often as needed. Mostly the problems were when the script had done something wrong, like inserting replacement strings nested twice, but overall it went surprisingly smoothly.

After that I had the usual "trial and error" sessions. The list of bugs I encountered doing this I have put online here - and the actual code can be found here, in my project's Bazaar branch on Launchpad. Taking into account the rather nasty exams I had this year, I'm quite happy with what I have achieved so far.
My little but effective knowledge of Gtk+ internals is helping me considerably - and again, big thanks go out to the hackers in the #gnome-hackers and #python channels on GimpNet. The biggest problem I have faced so far is waveform drawing. Jokosher doesn't issue an error or traceback, but it doesn't draw a thing either. That also keeps me from trying out event manipulation, so I may have to dig through a considerable amount of Gtk+ code after fixing the waveform.

So for the second part of this project my major aim will be to fix this waveform drawing. I also have a bunch of small errors (like Unicode problems) I have yet to figure out how to fix - or maybe they are just side nuances of using a jhbuild environment. After this I want to try out moving to PyGI for GStreamer. This will probably be very difficult, but also fun, because no one has actually tried that at home yet.[...]

In the end - final report of my GSoC

Tue, 17 Aug 2010 11:18:00 +0000

Hi everyone! It has been a great 12 weeks. While I failed to complete half of my plan (Telepathy Tubes support), I got VoIP call recording working - which was the reason I got involved in this Google Summer of Code in the first place. But before that, here's a little story of how I got there. Just to see how it works, watch this video I made during a conversation with a friend of mine (sorry for the faulty cheap headset, which forced me to cut the middle section of the conversation with Pitivi, as the headset just died on me). You can see me running Jokosher, choosing a contact, recording, pressing stop and getting the waveform redrawn, and at the end playing back the recorded conversation.

First of all, I took Michael Sheldon's old code, in which he tried to get all the accounts, connect them, and then let you choose a contact and call him (or her). The code had been rotting for some time in a Launchpad branch, and Telepathy has changed a lot of things here and there. First of all, I ran the old branch and saw that nothing worked. Piece by piece I gathered information on how to connect to the network, how to get contact handles, etc. The people in the Telepathy IRC channel (#telepathy on Freenode) were very helpful. I also have to give a thumbs up to the Telepathy spec documentation - once you start to understand the concepts, it is mostly all you will need. The Python bindings also have excellent examples.

In fact, things change so fast in the Telepathy world that I had to build a newer Ubuntu package for the Python bindings, as the Ubuntu Lucid one lacks Account Manager support. With a tip from Michael I built a newer one using the 'checkinstall' program, which allows fast and dirty Debian package building. If you plan to test my code on Lucid, download it from here and install it over your current version. If you want to build it yourself, get a git repository checkout, build it, execute 'checkinstall' in the bindings directory, change the settings (name of the package, what it provides, etc.
- make sure they fit the current naming/version scheme of the python-telepathy package) and run it. It will build the package and install it. For other distros, just be sure you have the newest Telepathy and its Python bindings installed. Additionally, on all distributions you will need the Farsight Python bindings and the Farsight/Telepathy glue library bindings too (named python-farsight and python-tpfarsight in Ubuntu).

After completing the basic connection and retrieving information about accounts, I started to work on making an actual call. At first I simply hooked up autoaudiosrc and autoaudiosink as a way to hear and record the call. This was the first time I got stuck, because implementing a call properly within Jokosher required understanding what actually happens there. In the end, it was very important to add a watch on the GStreamer pipeline bus so Telepathy/Farsight would know what was happening and could act on it. After that I could make calls, but no recording was done.

To actually record it, I had to join all the Farsight pipeline stuff with Jokosher's inner recording system. Jokosher records each instrument using a separate bin, and the bins are put together in one big pipeline (allowing states to be set en masse). Not wanting to complicate things, I just extended the current Farsight pipeline with two tees and split the output and input, so I could record and send my voice to the remote participant at the same time - and record and hear the voice from the other side too. Recording is done by bins similar to those for normal instruments, just tweaked for VoIP streams. This is where I got stuck quite seriously. In the end, the actual solution to my problem was quite logical: the Farsight sink pad for sending voice to the remote end and the source pad for getting it from there are created only after the whole pipeline is set to the playing state. That means those pads expect any other element to be set to playing before they get linked to it. Thanks, Tester from #farsight@Freenode, for explaining i[...]

Very long Google SoC report #3

Mon, 12 Jul 2010 09:42:00 +0000

Hi everyone!

I have been skipping my blog while trying to get Telepathy calls working within Jokosher. I wanted to post something when it was actually working. Well, I haven't got to that point yet - I am still trying to connect the Telepathy call stuff to Jokosher - but I feel obliged to tell you more about how I got here, because it has been quite an interesting road so far.

First of all I tried to run Michael Sheldon's code and see what works and what doesn't. Overall, I tried to use the newest Telepathy API spec, so to get anywhere I first needed to build the python-telepathy package from the newest git branch. I did it with checkinstall (found in the Ubuntu universe repo - thanks to Michael for the tip), which is quite a nice tool for fast-and-dirty package building. In the source directory just run ./configure, build with make, then launch sudo checkinstall. Just make sure to change the version to something appropriate, like 0.16, and change the Package name and Provides parameters to 'python-telepathy'. Then checkinstall will create a deb package and install it into your system, so get ready for your system to get trashed a little bit (for python-telepathy it isn't such a trouble).

After that I submerged myself in the world of VoIP. Thanks to the guys at #telepathy on Freenode, I understood how the Telepathy Python bindings work and got around to implementing the current code. Bigger problems arose when I tried to understand how Farsight kicks in during the call, but I got that too - I was lacking a watch that would make Telepathy/Farsight listen to GStreamer bus messages and then act accordingly (in fact, a very good example of how to make a call from Python is in the Telepathy Python bindings package's examples/ directory).
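The missing piece - a watch on the pipeline bus so that Telepathy/Farsight sees GStreamer messages - is a simple dispatch pattern. A plain-Python sketch of it (no GStreamer required; class and function names here are illustrative stand-ins, not the real bindings):

```python
# Stand-in for gst.Bus plus add_watch(): every message posted on the
# bus is handed to each registered watcher, the way the tpfarsight
# glue needs to see pipeline messages to drive the call.

class Bus:
    def __init__(self):
        self.watchers = []

    def add_watch(self, callback):
        self.watchers.append(callback)

    def post(self, message):
        # The pipeline posts state changes, new pads, errors, etc.
        for cb in self.watchers:
            cb(message)

handled = []

def tpfarsight_watch(message):
    # In real code this would forward the message to the
    # Telepathy/Farsight channel handler so it can react.
    handled.append(message)

bus = Bus()
bus.add_watch(tpfarsight_watch)
bus.post(("state-changed", "PLAYING"))
bus.post(("new-pad", "src_0"))
```

Without such a watch the call machinery never learns that the pipeline reached the playing state, which is exactly the symptom described above.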

For the last two weeks I have been trying to bind the TP/F pipeline to Jokosher's recording system. As most of the sinks and pads in a TP/F conference/session are created on the fly, it is quite challenging. It works like this: I created a GObject signal on TelepathyContact to bind everything to Jokosher's pipelines in Project.RecordVoIP. When a call is successful, this method binds every piece together to record it in instruments. I have two challenges here - I have to create another instrument besides voip to record the local speaker (let's call it "voip-local"), and I have to find a way to sync the Jokosher pipelines with the call ones. I also need to decide whether to let recording happen at once, or wait for the remote contact to respond.
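The GObject signal idea above can be sketched with a plain-Python stand-in - a contact object that emits a signal when the call succeeds, and a handler that stands in for Project.RecordVoIP wiring up the recording bins (all names are illustrative; the real code uses GObject's signal machinery):

```python
# Plain-Python stand-in for a GObject signal on TelepathyContact:
# when the call is successful, "call-started" is emitted and the
# connected handler binds the recording pieces together.

class TelepathyContact:
    def __init__(self):
        self._handlers = {}

    def connect(self, signal, handler):
        self._handlers.setdefault(signal, []).append(handler)

    def emit(self, signal, *args):
        for h in self._handlers.get(signal, []):
            h(*args)

log = []

def record_voip(instrument):
    # Stands in for Project.RecordVoIP hooking a recording bin
    # up to the named instrument.
    log.append("recording " + instrument)

contact = TelepathyContact()
contact.connect("call-started", record_voip)

# One emission per instrument: the remote voice, then the local one.
contact.emit("call-started", "voip")
contact.emit("call-started", "voip-local")
```

The point of routing this through a signal is that the pads only exist once the call is up, so the recording setup has to be event-driven rather than done at pipeline construction time.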

This is midterm review week, and while I have had some setbacks - family stuff like my sister getting married (yay) and a very hot last two weeks, which left me exhausted - I think I will finish my primary goal this week and start working on Tubes support. The current code in the telepathy-ng branch at Jokosher's Launchpad isn't doing anything useful, because I'm improving the code every day, but you can check it out and test it if you like.

Google SoC report #2 - Discovering MC5 and friends

Sun, 06 Jun 2010 16:28:00 +0000

Hi everyone, this is my second report on my Google SoC project, which will bring initial Telepathy support to Jokosher. The previous week I was very busy with my qualification paper, which I finished and submitted this Monday. After that I jumped in.

First of all, I joined the IRC channel #jokosher to discuss with Michael Sheldon and Laszlo Pandy how to practically achieve my goals. As the Telepathy stack has been developed rapidly over the last several years and things have changed since Michael's work on his telepathy branch, I first had to rewrite the method that gets accounts from Telepathy. Previously it used gconf to store account information, but now you have to do it all using D-Bus magic. Of course, a lot of things are made easier by the python-telepathy bindings. The Telepathy guys are slowly moving their focus to GObject introspection and the PyGI way of doing things, but my mentor and I agreed that I should move forward with the former. I also changed the application's visual behaviour so that if there are no accounts of a supported type defined and the user tries to add a 'VoIP' instrument, it shows a nice information message in the error message area. I am also doing a patch outside the scope of the project (which I plan to finish tomorrow) to support GTK InfoBar, as Jokosher currently uses a custom class to build this area, to fix various theme bugs - like the current one with dark fonts on a dark background. I also plan to add a button that opens the Empathy Accounts dialog, so users can add accounts straight from Jokosher.

For next week I plan to connect all the dots and actually record a conversation. This will be an interesting challenge, as I have never fully investigated gnonlin. Another problem I want to fix is getting Jokosher to connect to accounts while Empathy is already using them. Currently it requires Empathy to be closed, as it tries to request a new connection.

Of course, all the code can be found in lp:~pecisk-gmail/jokosher/telepathy-ng. You can get it on Ubuntu using the command 'bzr branch lp:~pecisk-gmail/jokosher/telepathy-ng'.

Google Summer of Code 2010 and long time no post

Fri, 14 May 2010 09:59:00 +0000

Hi here!

I have several blogs, but for a very long time I was quite busy and only had time for a few posts on my Latvian ones (mostly Ubuntu, free software and OpenStreetMap related). Now I finally have some time and, more importantly, a reason to post more here, so I hope it will become a regular habit. The reason is of course quite thrilling - some time ago I was confirmed as a participant in this year's Google Summer of Code, which is great, because I love this initiative and have always wanted to be part of it. So here I will try to detail my plans for my Summer of Code.

In a nutshell, I will try to make Jokosher a podcast killer application. When Jono Bacon and friends started Jokosher, the main aim was to make podcasting easy and free of technical wizardry. By now, Jokosher has nice basic recording capabilities, a lot of good LADSPA effects and, of course, an excellent user interface. In previous years we fought to get multichannel cards supported through the ALSA GStreamer elements, and to achieve basic stability, so it wouldn't crash when something really bad happens.

For my work I will build on Michael Sheldon's work on Telepathy VoIP support in Jokosher, so I have to create the user interface, connect all the dots and, of course, test it in real life, doing real podcasts with friends :) Then I have two bigger challenges. First, I have to extend the current code to use Telepathy Tubes, so I can do a lot of cool things remotely within a Jokosher-to-Jokosher session. Then I will use this Tubes support to create high-definition sound sync. It will work like this: for communication I will use a simple VoIP call. After the call is finished, Jokosher will sync the actual recorded sound with the Jokosher at the other end, so in the end you can have a very high quality mix for your podcast, while doing the call over a low-bandwidth codec.

And yes, finally this is also my "hi" to Planet Gnome!

FYI: MyBooks Professional as Quicken replacement

Sat, 06 Dec 2008 11:41:00 +0000

If you want to migrate to Linux or a mixed environment and are looking for a serious replacement for Quicken or another personal or SMB finance management package, then check out MyBooks. I got the link from a Slashdot discussion about IBM offering a Linux workstation business package. The site looks a little bit dated (circa 2001 or so), but that doesn't always indicate the quality and/or usefulness of an application. There is a trial version available, so check it out.

Evolution which sounds like revolution

Fri, 28 Nov 2008 11:58:00 +0000

There are several significant developments within the Evolution community which interest me most:

1. First, the Evolution team has promised to provide Windows binaries for each Evolution release. The last one they did is version 2.24.2 (which is quite recent). This is very important news, because so far Windows has been left out in the cold when we talk about quality open source e-mail clients. Yes, there is Thunderbird, but it still feels very immature to me, even after reaching 2.0. Sorry, but having to put a signature in a special .signature file doesn't cut the mustard1.

Anyway, the availability of Evolution on Windows (and later on OS X, I hope) could change the landscape of groupware offerings, because finally there would be a viable alternative to Outlook, which is still the king on Windows. My dream is to have a CalDAV + mail system (Postfix/Exim + Dovecot) as an easy replacement for an Exchange system. Evolution has had CalDAV support for calendars for some time already, but here comes the biggest news...

2. Milan Crha, the man who guided my successful attempt to provide my first "official" patch to Evolution earlier this year, has nailed down and finally committed patches supporting CalDAV VTODO (known as Tasks in Evo) and VNOTES (known as Memos) in Evolution. This gives me the possibility to use a DaviCal server with Evolution and fully share my plans and tasks between my work and home computers, and my laptop too (I patched Evo in Hardy and Intrepid to do this). Yes, there are some rough edges, and stability sometimes suffers (without loss of information), but it is done - the next stage is improving accounts (having one account to create in Evo that connects to all the goodies of CalDAV). It has already improved my planning and scheduling so much that I really owe Milan a beer (or whatever his favourite party drink is :)).

Looking back - when Novell ran into some financial troubles and formed a strange partnership with Microsoft (which still bothers me from time to time), I feared it meant the end of Evolution as we know it. Surprisingly, Novell not only continues to provide superb support for Evolution, it has also worked with all the contributors to change the license to GPLv3 and eased the way for others to fix and improve Evo as they need (Milan works for Red Hat, for example). Yes, it has bugs, but I must say this - Evolution is the Firefox of open source groupware. There is competition, but no one has come even close to delivering as much as it does.

Big thank you to you Evo guys, you really rock!

1 No hard feelings towards the Mozilla and Thunderbird teams. You still do your job, and Thunderbird is still more popular than Evo. Keep the improvements coming, and maybe one day we will have nice competition between Thunderbird + Lightning and Evo.

How to fix broken gconf, or how to deal with a desktop that shows up borked

Sun, 14 Oct 2007 11:32:00 +0000

It is easy to screw up your Debian/Ubuntu install with self-made debs. Well, I did it myself and paid the price. Everything started with GnuCash which, as of version 2.2, also has a Windows port - I suggest it to all Windows users as a personal financial manager. Yes, it lacks some features, but if you are in Europe, it fits perfectly.
However, when I wanted to use my file in the Ubuntu Feisty version of GnuCash, it gave an error. I found out that 2.0 and 2.2 have format compatibility problems. So I went and built a deb package for version 2.2.1. Unfortunately, it included a lot of non-GnuCash files, so when I installed it, it overwrote a bunch of libraries, including the gconf ones.
As long as this GnuCash version was on my computer, there were no problems. However, uninstalling it (I got the "official" 2.2.1 version of GnuCash through Feisty Backports) left my desktop b0rked, because gconf had problems with its libraries and couldn't read its XML configuration files.
So I rebooted the computer, went to the console with ALT+CTRL+F1, logged in and issued this command:
sudo apt-get --reinstall install gconf2 libgconf2-4 gconf2-common
And then rebooted the computer with the command:
sudo reboot
After the restart everything should be all right. If you still have problems, then they are not caused by missing libgconf libraries.

Shame, shame, ohhh, what a shame...

Wed, 30 Aug 2006 10:07:00 +0000

It has been a very long time with no blogging; however, that wasn't my intention, and even if no one reads this, it is a little shame that I have almost no time for this journal, because when I created it I wanted it to matter.

Now, however, I am motivated to start from the beginning, and so here we go...

Ubuntu 6.06 LTS aka Dapper was released almost two months ago, and this crucial time has passed without big worries. The only somewhat important issue was the xserver-xorg release screwup, but I beg to differ about its impact - it mostly affected those who update almost every day. Rest assured, I am not one of them, and neither are a lot of common users. :) Anyway, the community leaders have learned their lesson from this and are now investigating how to deal with such situations, i.e. mostly how to test updates for a stable release before releasing them into the wild.

Anyway, I use Dapper on my work computer for now, and I'm very happy with its performance and my "common user test" so far. Certainly there is a lack of functionality here and there (most of it already available in Edgy updates), and sometimes there are gripes about problems with Microsoft Office documents, but overall I think Ubuntu is well on its way to being the Linux desktop king for common users.

I'm planning to post a lot from now on, about many different things around Ubuntu, Linux and free software. There will also be "Peter Learns" articles about system tweakage, or how to do something without breaking your stable Ubuntu. There will also be my multimedia hurdles - DVDs, GStreamer, and Jokosher progress (those guys continue to deliver what they promise, kudos to them). Jono also wrote a good intro article about the GStreamer, Python and GTK combo, or how to make a multimedia player in 100 lines of code.

There are lots of things to hype, to be saddened by, to be happy about, or to just have a warm feeling inside about, so stay tuned.

And oh, certainly Ubuntu Diaries will have a Latvian version too.

ALSA worries continue

Sat, 08 Apr 2006 12:29:00 +0000

And I thought to myself that it was too easy... Yesterday, playing again with Jokosher, I ran into two errors on both computers I use to test gstreamer-alsa/Jokosher stuff. First of all, I got an error while trying to address 'hw:0' as the device to record from. It was bugging me for a while and I thought it was an Ubuntu or ALSA bug. After consulting with j^ in private on IRC, I learned that these are two separate issues. The first was on my work computer: any ALSA recording using 'device=hw:0' failed.

pecisk@ubuntu:~$ gst-launch-0.10 alsasrc device=hw:0 ! flacenc ! filesink location=device.flac
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
ERROR: from element /pipeline0/alsasrc0: Could not get/set settings from/on resource.
Additional debug info:
gstalsasrc.c(349): set_hwparams (): /pipeline0/alsasrc0:
Rate doesn't match (requested 44100Hz, get 0Hz)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
FREEING pipeline ...
pecisk@ubuntu:~$

However, j^ suggested trying the same recording with arecord, which gave me more clues about what causes the failure:

pecisk@ubuntu:~$ arecord -D hw:0 -f cd test.wav
Recording WAVE 'test.wav' : Signed 16 bit Little Endian, Rate 44100 Hz, Stereo
Warning: rate is not accurate (requested = 44100Hz, got = 48000Hz)
please, try the plug plugin (-Dplug:hw:0)

So the issue turned out to consist of two problems: my sound card has recording locked at 48000Hz - quite unusual for a built-in sound card (it could be a card problem or an ALSA driver bug) - and alsasrc (and, as far as is known, alsasink too) doesn't probe the card for its rate; instead it just requests one. If the card's rate doesn't match the one requested by alsasrc/alsasink, it fails.
There can be several solutions. First, probing the PCM's rate in alsasrc/alsasink would be very good, because it would let various encoders/players/recorders convert their media before playing it. This, however, would require bugging the GStreamer guys _or_ taking it upon myself. Another solution is to simply probe the sound card's PCM for its rate directly and use it within the pipeline - but that feels more like a workaround than a long-lived solution to the problem:

gst-launch alsasrc device=hw:0 ! "audio/x-raw-int,channels=2,rate=48000,depth=16" ! audioconvert ! flacenc ! filesink location="test.flac"

So much for my first problem. The second one, however, appeared to be more difficult. I have two sound cards in my home computer:

pecisk@blackcatstudio:~$ cat /proc/asound/pcm
00-00: VIA 8235 : VIA 8235 : playback 4 : capture 1
00-01: VIA 8235 : VIA 8235 : playback 1 : capture 1
02-00: ICE1712 multi : ICE1712 multi : playback 1 : capture 1
02-01: ICE1712 consumer : ICE1712 consumer : playback 1 : capture 1
02-02: ICE1712 consumer (DS) : ICE1712 consumer (DS) : playback 6

Using System > Preferences > Sound, you can choose which card will be used for playing sounds/movies/etc. (I usually switch to my EWS88MT, which sounds just perfect for music/movies with gst-0.10, thanks to the ALSA/GStreamer guys). It changes the order of the sound cards (defining which card is the default one) and creates .asoundrc and .asoundrc.asoundconf files in your home directory. This, however, screws up the 'hw' definitions, and it was no longer possible to record anything using 'hw:0', even from arecord.

So on my todo list is to:
* File a bug about Ubuntu/GNOME screwing up sound card definitions in ALSA - should be easy to fix;
* File a bug for GStreamer about probing the rate from PCMs and using it for playback/recording;
* Take a look at ALSA, GStreamer and GNOME with various cards in compute[...]
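The workaround pipeline above pins the rate by hand. If you probe the card's real rate first (e.g. from arecord's warning), the caps string can be generated instead of typed - a tiny sketch (the helper function is mine for illustration, not GStreamer API; the caps syntax is GStreamer 0.10 raw-int audio):

```python
def caps_for_card(card_rate, channels=2, depth=16):
    """Build the GStreamer 0.10 raw-audio caps string that pins the
    pipeline to the rate the card actually supports, instead of
    letting alsasrc request 44100 Hz and fail."""
    return ("audio/x-raw-int,channels=%d,rate=%d,depth=%d"
            % (channels, card_rate, depth))

# My built-in card is locked to 48000 Hz, so:
print(caps_for_card(48000))
# audio/x-raw-int,channels=2,rate=48000,depth=16
```

With audioconvert (and a resampler) after the caps filter, the rest of the pipeline still sees whatever format it wants, which is why this works even though the card refuses 44100 Hz.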

Death of the bug and other musings

Fri, 07 Apr 2006 14:19:00 +0000

Well, after a little brainstorming, noise and brouhaha, the annoying gstreamer-alsa recording bug got fixed. Thanks to j^ and Sebastian of Ubuntu, a bug fix was provided and a new version rolled out yesterday. So now I can record with Jokosher and GNOME Sound Recorder on Ubuntu, using the GStreamer backend, without any problem.

In other news, the nvidia binary driver got another bug fix release, which seems to have eliminated a lot of the lockups on my work computer. I will try Xgl later; it was unusable with the driver's previous behaviour, i.e. it locked up after 5 minutes.

I played with Jokosher - nice and easy for now, though without any serious functionality (as it awaits its 0.1 release). Slices and their split/join look shiny and really promise a serious app. Ellio started hacking on an ALSA channel split element for GStreamer (for multiple channel card owners like me :)); various bits and pieces are getting added, and visual bugs fixed. Ellio is also working on using Gnonlin (non-linear elements for GStreamer, currently used in various video editors) for audio stuff.

Modern Ubuntu times

Sun, 02 Apr 2006 19:27:00 +0000

There are moments in life when I miss all that early Linux desktop stuff, when you had to tweak your Linux distribution to the maximum to get, for example, DivX playing. Nowadays I just have to enable universe and multiverse and install gstreamer-ffmpeg, gstreamer-plugins-bad and gstreamer-plugins-ugly (for AC3 sound) to get it to work. A few clicks and I can play my DivX just fine, and QuickTime movies too. However, the Totem plugin for Firefox still has a lot of rough edges, so there is room for improvement. But I am very happy to see the GStreamer idea proving its worth.

That said, the Ubuntu Dapper beta still has bugs to squash, and here comes today's story about several important ones. A bunch of fun and talented guys have created the Jokosher project, which aims to provide an easy to use and easy to configure multitrack recorder/sequencer using the newest technologies like GStreamer 0.10, GTK+ 2.8, Python and Cairo (more in the Cakewalk/Cubase direction; in the meantime, I would like to point out that the professionally acclaimed Ardour, a Pro Tools replacement, will get a GTK+ 2 interface very soon too). They are having great fun coding and improving their baby, and I say more power to them.
While testing their work, I ran into the problem that the newest gstreamer-0.10 in Dapper won't record anything from alsasrc, which is a great pity. It seems recording with GStreamer is all broken. However, the Ubuntu guys have been informed about this problem, as has the GStreamer mob, so I hope for a fast fix.
And yeah, these are the days when Dapper flies for the sixth time. Grab it there for your testing/bug reporting pleasure. And register on Launchpad to file bugs and comment on them with your experience if one already exists.

Food fight at GNOME Planet, part II

Tue, 07 Feb 2006 12:00:00 +0000

It was quite easy to find out what is really going on here with all the land shaking and screaming about lost lives (ok, that was overblown). After Jimmac posted his "why oh why", it was very easy to say that had such a post appeared a few weeks earlier, there wouldn't have been such flaming and name calling aimed at him and Dobey. It was just about cleaning up the icon theme, using the new, much better looking Tango icons and dropping names from inside icons. When described, it all makes sense. However, it would be nice to find a different, visually easy way to distinguish different file/connection types. It may mean nothing to the common user, but for advanced ones with their motor skills it can save minutes of their work time.
One of the proposals is to use emblems - and I would actually like that. However, the number of emblems is somewhat limited and you can't have an emblem for each file type. Then I thought about text overlaid on the icon - you could easily turn it on/off.

Actually, at least on Ubuntu Dapper, after a few hours you forget that those icons used to contain text, because the new ones look very clear and polished. Kudos to Jimmac and Dobey and the whole Tango team.

I got myself a little reading about GStreamer, GLib, GObject and GStreamer MIDI support. I wanna get dirty with all this stuff :) But let's start with C.

First time experience

Mon, 06 Feb 2006 11:00:00 +0000

This is just my first post on this blog, for testing's sake. I will write here about my bug hunting with GNOME/Ubuntu and Linux in general, my open source and free software experience in the server field, and also about the free software tools available in the digital recording field.

For now, I'm just messing with Ubuntu Dapper, which makes some positive and some negative strides, but overall Ubuntu gets better and better with each release. I have filed various bugs here and there (mostly multimedia related), and got confused and saddened by the legal straws and the problems with getting out-of-the-box mp3 support into distros (I prefer Ogg Vorbis, but I don't want to leave the many users with lots of mp3s out in the cold). Also there is a little food fight on p.g.o. about icons in buttons and text in buttons. As expected, in theory the steps taken sound nice, but in reality one glove can't fit every hand. For example, I actually don't care about text in icons for graphical files; it would probably be better without it. However, I am very much against removing it from mount icons. And it is good that some kind of compromise is starting to appear - for example, emblems à la Nautilus fame would be very nice.

In the evening I will write more.