Don't forget to download the font Trebuchet in the same link.
August 05, 2019
April 13, 2019
April 08, 2019
April 07, 2019
In recent years I have been working on the Fluent Bit project, making it a reliable system-level tool that solves most of the logging challenges we face nowadays, with a strong focus on cloud environments. This process has been a joint effort with the community of individuals and companies who are deploying it in production.
The whole point of logging is to perform data analysis, so anything that makes it more reliable, easier, and more flexible is a good addition to have. As a project maintainer I am always looking for innovation, and the Stream Processing topic has gotten a lot of attention in my circle of colleagues and in the community in general.
Stream Processing (aka SP) can be described as the ability to perform data processing while the data is still in motion. Most people who are familiar with the term know about Apache Spark, Apache Flink, and Kafka Streams, among others. Most of this tooling provides a full set of data processing capabilities and helps to perform flexible data analysis once the data is fully aggregated.
I mentioned above that Stream Processing happens once the data is aggregated, meaning that different services send data from multiple local/remote sources to a central place where processing and analysis can be performed. But what if we could do distributed stream processing on the edge side? This would be very beneficial, since we could catch exceptions or trigger alerts based on specific data processing results as soon as they happen.
To implement Stream Processing on the edge, we need the proper tooling, which at a minimum must have the following features:
- Ability to collect, parse, filter and deliver data to remote hosts.
- Lightweight: low memory and CPU footprint.
- Provide a Query language to perform computation on top of streams of data.
- Be open source (of course, right?)
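As a concrete reference for the first point (collect, parse, filter, and deliver), here is a minimal Fluent Bit pipeline sketch; the file path, regex, tag, and destination host are illustrative, not a recommended production setup:

```ini
[SERVICE]
    Flush        5
    Log_Level    info

[INPUT]
    # Collect and parse: tail JSON log files
    Name         tail
    Path         /var/log/app/*.log
    Parser       json
    Tag          app.logs

[FILTER]
    # Filter: keep only warning/error records
    Name         grep
    Match        app.*
    Regex        level (error|warn)

[OUTPUT]
    # Deliver: forward to a remote aggregator
    Name         forward
    Match        app.*
    Host         aggregator.example.com
    Port         24224
```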
Fluent Bit is a good fit: its nature is data collection, processing, and delivery, which makes it a natural candidate to extend with Stream Processing capabilities. That is something we at Arm and Treasure Data have been working on over the last few weeks (although the idea was born in 2018).
Our current implementation will be showcased in the upcoming Fluent Bit v1.1.0 release in April 2019. It brings a Stream Processor engine with SQL support to query records and run aggregation functions over windows, with optional grouping. In addition, it allows the creation of new streams of data from query results, which can be tagged and routed like normal records through the Fluent Bit pipeline.
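To give an idea of what such a query looks like, here is a hypothetical stream query in the Stream Processor's SQL dialect (the stream name, tag, source stream, and field names are all illustrative):

```sql
-- Count HTTP 5xx records per host over 10-second tumbling windows,
-- and emit the results as a new stream tagged 'alerts.5xx' that can
-- be routed like any other record in the pipeline.
CREATE STREAM errors_5xx
  WITH (tag='alerts.5xx')
  AS SELECT host, COUNT(*) AS error_count
     FROM STREAM:apache
     WHERE code >= 500
     WINDOW TUMBLING (10 SECOND)
     GROUP BY host;
```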
In the next post I will share details on how to get started with this new Stream Processing feature. As usual, we are looking forward to your feedback...
February 13, 2019
Let me show some pictures of the event:
January 23, 2019
Download Mesh Align Plus: testing Mesh Align Plus v0.5.0 with Blender 2.8
January 15, 2019
January 06, 2019
Place: Kross Bar de Bellavista
Date: Saturday, February 2
Time: from 18:00 onwards.
November 04, 2017
I've just noticed that the last post on my blog was almost a year ago! I will try to fix that and post more often.
Fluent Bit got a lot of traction in 2017: people from the Cloud Native space started asking for more specific features, and these were implemented. Now we can see that Fluent Bit is deployed a few thousand times every single day, having a real impact by solving logging pains at scale.
As a maintainer and core developer, I am very happy to see this traction from users, but there is also a growing community; honestly, without them the project would not be rocking the way it is today. End users are an important piece of the project, helping with contributions, troubleshooting, and feedback to align the roadmap in the right direction, so thank you all for your help and patience!
In 2017, as of today, we have done 27 releases: 3 of them were major releases and the rest bug-fix releases focused on stability. We started the year with 0.10 as the major version and are finishing with 0.12 as the stable series, while 0.13 is just showing up in a development stage.
From a technical perspective, Fluent Bit acquired the following features this year:
- Native support for nanoseconds timestamps
- Environment variables support in configuration
- UTF-8 Support
- Full Parsing interface (JSON and Regex backends)
- Built-in HTTP Client
- Tail input: greatly improved performance, plus multiline log support
- Systemd Journal Input: read logs from Systemd Journal
- Syslog input: extended support for old Syslog clients.
- Disk Input: new plugin to gather metrics from local disks.
- Kubernetes Filter: enrich logs with Kubernetes metadata
- Grep Filter: match and exclude records based on regex patterns
- Record Modifier Filter: modify record contents
- Elasticsearch output: added support for Basic Auth
- Kafka REST output: support for the Apache Kafka REST backend
- Flowcounter output: new counter/stats plugin
- File Output: write records to the file system (msgpack or JSON)
- Extended Unit testing: internal routines and runtime tests
The list could be more extensive, as there are many other improvements in each subsystem. All of this has been done thanks to the contributions of more than 30 people in areas such as bug reporting, troubleshooting, code fixes, and documentation, among others.
Fluent Bit 0.13
This is the current development version; in addition to everything in 0.12, the following features are already available:
- New HTTP REST Interface:
- Service information
- JSON Metrics
- Prometheus Metrics
More details about the new stuff will be published at CloudNativeCon US!
April 23, 2017
December 30, 2016
Fluent Bit is a multi-platform Log Forwarder written in C
This year Fluent Bit got several new features, such as event routing, buffering, an improved shared library mode, and many new plugins to collect data and deliver it to new destinations: in_tcp, in_forward, in_health, in_proc, out_http, out_influxdb, out_flowcounter, etc.
One of the recent features that got a lot of attention at the last CloudNativeCon was the ability to extend its output destinations through Golang plugins: Fluent Bit can dynamically load shared libraries created with Go, which is really neat.
Something I have not written much about is that the new version fully runs in Windows environments. The same code base works on Windows, it's portable (no Cygwin/MinGW), and it can be compiled with Visual Studio without effort. This is still experimental but functional; I expect that for the 0.11 release we will have Windows binaries available and docs about them.
It has been exciting to see how the project has evolved; it is now walking towards being a Cloud Native Log Forwarder. There are a few missing features that are a priority for the beginning of 2017, such as filtering and monitoring capabilities; they will come very soon.
At Treasure Data, we are fully committed to continuing to work hard on improving all aspects of logging in cloud environments; the recent news about Fluentd joining the Cloud Native Computing Foundation is just the beginning. You can expect much good news from both projects (Fluentd and Fluent Bit) in the incoming 2017.
Thanks to everyone involved in Fluent Bit!
December 15, 2016
November 18, 2016
It has been a while since the last post, and so many good things have happened. I will not dig into a full_detailed_post, but here are some hints:
- The Fluent Bit project v0.9 has been released. Now working towards 0.10, which comes with Golang support for output plugins, among other neat things. Shortly it will become the default Log Forwarder :).
- The Fluentd project has joined the Cloud Native Computing Foundation!
- I spoke at LinuxCon Europe (Berlin), LinuxCon NorthAmerica (Toronto) and CloudNativeCon (Seattle).
I have no updates for Monkey or Duda, but they will come shortly, as from a features perspective I need to connect the dots in Fluent Bit first.
May 05, 2016
Some weeks ago I attended the Embedded Linux Conference in San Diego, where I participated in the Showcase demonstrating Fluent Bit and a no-longer-secret project that runs on top of it; more news in the coming weeks.
As soon as you power up the device through the micro-USB, you can access the serial console and start playing with it; it comes with Debian 8, WiFi (AP mode supported), 512 MB of RAM, a 1 GHz processor, 4 GB of storage... among other things (and for 9 dollars!). You should definitely consider getting one!
February 08, 2016
About a year ago, Igalia was approached by the people working on printing-related technologies at HP to see whether we could give them a hand in their ongoing effort to improve the printing experience on the web. They had been working for a while on extensions for popular web browsers that would allow users, for example, to distill a web page from cruft and ads and format its relevant contents in a way that would be pleasant to read in print. While these extensions were working fine, they were interested in exploring the possibility of adding this feature to popular browsers directly, so that users wouldn't need to be bothered with installing extensions to have an improved printing experience.
That's how Alex, Martin, and I spent a few months exploring the Chromium project and its printing architecture. Soon enough we found out that the Chromium developers had already been working on a feature that would allow pages to be cleaned of cruft and presented in a sort of reader mode, at least in mobile versions of the browser. This is achieved through a module called dom distiller, which basically has the ability to traverse the DOM tree of a web page and return a clean DOM tree with only the important contents of the page. This module is based on the algorithms and heuristics of a project called boilerpipe, with some of it also coming from the now popular Readability. Our goal, then, was to integrate the DOM distiller with the modules in Chromium that take care of generating the document that is sent to both the print preview and the printing service, as well as making this feature available in the printing UI.
After a couple of months of work and thanks to the kind code reviews of the folks at Google, we got the feature landed in Chromium's repository. For a while, though, it remained hidden behind a runtime flag, as the Chromium team needed to make sure that things would work well enough in all fronts before making it available to all users. Fast-forward to last week, when I found out by chance that the runtime flag has been flipped and the Simplify page printing option has been available in Chromium and Chrome for a while now, and it has even reached the stable releases. The reader mode feature in Chromium seems to remain hidden behind a runtime flag, I think, which is interesting considering that this was the original motivation behind the dom distiller.
As a side note, it is worth mentioning that the collaboration with HP was pretty neat and it's a good example of the ways in which Igalia can help organizations to improve the web experience of users. From the standards that define the web to the browsers that people use in their everyday life, there are plenty of areas in which work needs to be done to make the web a more pleasant place, for web developers and users alike. If your organization relies on the web to reach its users, or to enable them to make use of your technologies, chances are that there are areas in which their experience can be improved and that's one of the things we love doing.
February 04, 2016
We've opened a few positions for developers in the fields of multimedia, networking, and compilers. I could say a lot about why working at Igalia is very different from working at your average tech company or start-up, but I think the way it's summarized in the announcements is pretty good. Have a look at them if you are curious, and don't hesitate to apply!
July 10, 2015
It's summer! That means that, if you are a student, you could be one of our summer interns at Igalia this season. We have two positions available: the first related to WebKit work and the second to web development. Both positions can be filled in either of our locations in Galicia, or you can work remotely from wherever you prefer (plenty of us work remotely, so you'll have to communicate with some of us via Jabber and email anyway).
Have a look at the announcement in our web page for more details, and don't hesitate to contact me if you have any doubt about the internships!
April 28, 2015
A follow-up to my last post. As I was writing it, someone was packaging Linux 4.0 for Debian. I fetched it from the experimental distribution today, and everything that was broken with the X1 Carbon now works (that is, the Bluetooth keyboard, trackpad button events, and 3G/4G USB modem networking). WEP128 authentication still doesn't work, but you shouldn't be using it anyway, because aircrack and so on and so on.
So there you have it: just upgrade your kernel and enjoy a functional laptop. I will still take the opportunity to publicly shame Lenovo for the annoying noise coming out of the speakers every once in a while. Bad Lenovo, very bad.
March 05, 2015
daniel@daniel-vaio:~$ lsb_release -d
Description: Ubuntu Vivid Vervet (development branch)
daniel@daniel-vaio:~$ systemctl status | head
Jobs: 0 queued
Failed: 0 units
Since: Wed 2015-03-04 18:55:00 CLST; 3h 8min ago
├─1 /sbin/init splash
│ │ ├─690 avahi-daemon: running [daniel-vaio.local]
daniel@daniel-vaio:~$ dpkg -l upstart systemd
Desired=Unknown/Install/Remove/Purge/Hold
| Status=Not/Inst/Conf-files/Unpacked/halF-conf/Half-inst/trig-aWait/Trig-pend
|/ Err?=(none)/Reinst-required (Status,Err: uppercase=bad)
||/ Name Version Architecture Description
ii systemd 219-4ubuntu1 amd64 system and service manager
ii ubuntu-minimal 1.331 amd64 Minimal core of Ubuntu
un upstart <none> <none> (no description available)
September 10, 2014
April 16, 2013
- Install all the things jhbuild won't install for you.
$sudo apt-get install build-essential
$sudo apt-get install git
$sudo apt-get install gettext xsltproc docbook-xml docbook-xsl
$sudo apt-get install apt-file autopoint
- Install jhbuild from git (if you previously installed it with apt-get, use apt-get remove jhbuild to remove it); building and installing it puts the jhbuild script in ~/.local/bin:
$git clone git://git.gnome.org/jhbuild
$cd jhbuild
$./autogen.sh
$make
$make install
$echo PATH=$PATH:~/.local/bin >> ~/.bashrc
- Prepare your first .jhbuildrc config file.
$cp examples/sample.jhbuildrc ~/.jhbuildrc
- Edit ~/.jhbuildrc file
- Search for the line starting with "prefix" and change the directory to a writable one (the default /opt/gnome is not writable by default on Ubuntu, it seems)
- For evince, you need to add the following lines to ~/.jhbuildrc
module_autogenargs['evince'] = autogenargs + ' --enable-debug --disable-nautilus'
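Put together, the relevant portion of ~/.jhbuildrc might end up looking like this sketch (it is plain Python read by jhbuild; the prefix and checkout paths are illustrative, and autogenargs and module_autogenargs are predefined by jhbuild's default config):

```python
import os

prefix = os.path.expanduser('~/jhbuild/install')         # writable install prefix
checkoutroot = os.path.expanduser('~/jhbuild/checkout')  # where sources get cloned
moduleset = 'gnome-suites-core-3.8'                      # or gnome-suites-core-3.10 for master
module_autogenargs['evince'] = autogenargs + ' --enable-debug --disable-nautilus'
```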
- Install dependencies using the new "sysdeps" command of jhbuild
$jhbuild sysdeps --install
- Building evince
$jhbuild build evince --ignore-suggests --nodeps
If the build of one of the dependencies fails (libsecret, for instance), you can build just that module with
$jhbuild buildone libsecret
and then resume the compilation of evince. If you want, you could do, at this stage,
$jhbuild buildone evince
to trigger just the compilation of evince and NONE of its dependencies.
We use --ignore-suggests to avoid compiling a lot of modules that are not really important for developing with evince.
If everything goes right, you should have evince 3.8 compiled. If you want master, you need to choose the "gnome-suites-core-3.10" moduleset.
April 09, 2013
December 27, 2012
Like many other applications, this one also needs to be migrated to GTK3. And as in other migrations, it is also necessary to port from GConf to GSettings. That is the case of gnome-blog, which needs exactly that work. The bug is this one.
I have already made the main changes, and the proof is this post (which I hope shows up).
The changes made are:
+ port from PyGTK to PyGObject.
+ port from GConf to GSettings (creation of its schema and removal of GConf).
+ tests with the MetaWeblog protocol (the rest of the protocols still need to be tested).
+ drag and drop for images.
+ tests on other machines.
Some tests… (http://avaldes.gnome.cl/?p=554)
December 03, 2012
Last Saturday, October 27, Día GNOME 2012 took place in the city of Curicó, Chile.
Attendance was spectacular, and thanks to the sponsorship of the GNOME Foundation we were able to keep up the tradition of giving each attendee a small lunch, so everyone could share together and carry on the spirit of Free Software.