Re: Proposal: Adding json logging - Mailing list pgsql-hackers

From: David Arnold
Subject: Re: Proposal: Adding json logging
Msg-id: CAH6vsW+g4FrWnX-XXy8dhqrUjhs92qdR_wxjJPm8-J=Tfc1GUQ@mail.gmail.com
In response to: Re: Proposal: Adding json logging (Christophe Pettus <xof@thebuild.com>)
Responses: Re: Proposal: Adding json logging; Re: Proposal: Adding json logging
List: pgsql-hackers
>More specifically, JSON logging does seem to be a solution in search of a problem. PostgreSQL's CSV logs are very easy to machine-parse, and if there are corrupt lines being emitted there, the first step should be to fix those, rather than introduce a new "this time, for sure" logging method.
>It's a matter of a few lines of code to convert CSV logs to a JSON format, if you need JSON format for something else.
In light of the specific use case / problem that gave rise to this thread, what exactly would you suggest?
If it is fixing CSV logs to guarantee that exactly one line is emitted per event, then that is equally a solution to the problem:
- It would be preferable in the light of "minimal code change".
- It would probably break some downstream parsers already in place.
- It would not address the problem described at https://brandur.org/logfmt (logs that are both machine- AND human-readable).
- I don't know of many logging libraries that concluded CSV is the best way to move forward. (They could all be wrong, though.)
- No off-the-shelf parser exists; you need to write code, and however small it might be, it becomes a component in your stack and therefore a SPOF (single point of failure).
The first point is definitely a strong one. Did I miss any additional arguments?
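For context on the "few lines of code" point: a conversion along those lines might look roughly like the sketch below. The column names assume the csvlog field order documented for PostgreSQL 10, and the sample line is purely illustrative; adjust both for your server version.

```python
# Sketch: convert PostgreSQL csvlog lines to JSON objects.
# Uses only the standard library; csv handles quoting and embedded newlines.
import csv
import io
import json

# Field order per the PostgreSQL 10 csvlog documentation (assumption;
# verify against your server version).
CSVLOG_COLUMNS = [
    "log_time", "user_name", "database_name", "process_id",
    "connection_from", "session_id", "session_line_num", "command_tag",
    "session_start_time", "virtual_transaction_id", "transaction_id",
    "error_severity", "sql_state_code", "message", "detail", "hint",
    "internal_query", "internal_query_pos", "context", "query",
    "query_pos", "location", "application_name",
]

def csvlog_to_json(csv_text):
    """Yield one JSON string per log event."""
    for row in csv.reader(io.StringIO(csv_text)):
        yield json.dumps(dict(zip(CSVLOG_COLUMNS, row)))

# Illustrative (fabricated) csvlog line:
sample = ('2018-04-15 12:24:00.000 UTC,"postgres","mydb",12345,'
          '"[local]",5ad36a00.3039,1,"SELECT",2018-04-15 12:20:00 UTC,'
          '"3/42",0,LOG,00000,"duration: 1.234 ms",,,,,,,,,"psql"\n')
for line in csvlog_to_json(sample):
    print(line)
```

Note that even a sketch this small still hard-codes the column order, which is exactly the version-coupled, stack-resident component the SPOF point above is about.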
On Sun, Apr 15, 2018 at 12:24, Christophe Pettus (<xof@thebuild.com>) wrote:
> On Apr 15, 2018, at 10:07, Christophe Pettus <xof@thebuild.com> wrote:
>
>
>> On Apr 15, 2018, at 09:51, David Arnold <dar@xoe.solutions> wrote:
>>
>> 1. Throughout this vivid discussion a good portion of support has already been manifested for the need of a more structured (machine readable) logging format. There has been no substantial objection to this need.
>
> I'm afraid I don't see that. While it's true that as a standard, CSV is relatively ill-defined, as a practical matter in PostgreSQL it is very easy to write code that parses .csv format.
> More specifically, JSON logging does seem to be a solution in search of a problem. PostgreSQL's CSV logs are very easy to machine-parse, and if there are corrupt lines being emitted there, the first step should be to fix those, rather than introduce a new "this time, for sure" logging method.
>
> It's a matter of a few lines of code to convert CSV logs to a JSON format, if you need JSON format for something else.
>
> Remember, also, that every new logging format introduces a burden on downstream tools to support it. This is (still) an issue with JSON format plans, which had a much more compelling advantage over standard-format plans than JSON logs do over CSV.
>
> --
> Christophe Pettus
> xof@thebuild.com
DAVID ARNOLD, Gerente General
xoe.solutions | dar@xoe.solutions | +57 (315) 304 13 68
Confidentiality Note: This email may contain confidential and/or private information. If you received this email in error, please delete it and notify the sender.
Environmental Consideration: Please avoid printing this email on paper unless really necessary.