
I am new to Elastic.
I have setup ELK in Ubuntu machine.
I tried to upload a log file and a CSV file through the Kibana UI upload file option. But in both cases getting "File structure cannot be determined" error.
I have tried to override settings to take 10 lines only, still getting same issue.
Please help.

Error snapshot (screenshot: Kibana "File structure cannot be determined" upload error)

My CSV file looks like this - just a simple CSV file (screenshot)

My log file looks like this (screenshot)

  • What version of Kibana are you on? - 7.17.6
  • How large are your CSV and log files? - The log file is 4.76 MB and the CSV file is 2.38 MB.
  • Do you see any relevant errors in the Kibana server logs (likely located at /var/log/kibana)? - Okay, I will check this.
    Update - I checked the logs; no error stack trace was found.
  • I checked with a different CSV file; it seems that if the file has a timestamp that matches one of the listed time formats, it gets imported.
    However, all of my log files throw the above-mentioned error even though they have a timestamp and the time format matches the listed options.

    Hi @cadrija Welcome to the community!

    Apologies that you are having issues. Can you provide a couple of lines of the raw CSV as text (not a screenshot)?

    Post the raw lines exactly as they appear in the CSV; you can anonymize anything you need to - we just want to try with your data.

    I suspect there is still a format issue. You can change the format / data type under the advanced settings, so give us a couple of lines and we may be able to help.

    Hello @stephenb thank you!

    Sample logs 1

    2022-09-21T04:39:42,473 [main] ERROR: Test log: Inside LoanIQLoggingManager
    2022-09-21T04:39:44,685 [main] INFO: Started loading datatables
    2022-09-21T04:39:44,701 [main] INFO: Found resource DataTable.Deal.xml, jar:file:/C:/LoanIQ/Server/lib/liq_datatables-7.6.0.0.jar!/DataTable.Deal.xml
    2022-09-21T04:39:44,732 [main] INFO: adding key:datatable.abstractsgchangetransaction.xml, value:{DataTable.AbstractSGChangeTransaction.xml, jar:file:/C:/LoanIQ/Server/lib/liq_datatables-7.6.0.0.jar!/DataTable.AbstractSGChangeTransaction.xml}
    2022-09-21T04:39:44,732 [main] INFO: adding key:datatable.abstractschedule.xml, value:{DataTable.AbstractSchedule.xml, jar:file:/C:/LoanIQ/Server/lib/liq_datatables-7.6.0.0.jar!/DataTable.AbstractSchedule.xml}
    2022-09-21T04:39:44,732 [main] INFO: adding key:datatable.abstractscheduleitem.xml, value:{DataTable.AbstractScheduleItem.xml, jar:file:/C:/LoanIQ/Server/lib/liq_datatables-7.6.0.0.jar!/DataTable.AbstractScheduleItem.xml}
    2022-09-21T04:39:44,732 [main] INFO: adding key:datatable.abstracttemplatefield.xml, value:{DataTable.AbstractTemplateField.xml, jar:file:/C:/LoanIQ/Server/lib/liq_datatables-7.6.0.0.jar!/DataTable.AbstractTemplateField.xml}
    2022-09-21T04:39:44,732 [main] INFO: adding key:datatable.accrualcyclepayment.xml, value:{DataTable.AccrualCyclePayment.xml, jar:file:/C:/LoanIQ/Server/lib/liq_datatables-7.6.0.0.jar!/DataTable.AccrualCyclePayment.xml}
    2022-09-21T04:39:44,732 [main] INFO: adding key:datatable.accruallineitem.xml, value:{DataTable.AccrualLineItem.xml, jar:file:/C:/LoanIQ/Server/lib/liq_datatables-7.6.0.0.jar!/DataTable.AccrualLineItem.xml}
    2022-09-21T04:39:44,732 [main] INFO: adding key:datatable.accrualmatchfundedcostoffundspayableaggregation.xml, value:{DataTable.AccrualMatchFundedCostOfFundsPayableAggregation.xml, jar:file:/C:/LoanIQ/Server/lib/liq_datatables-7.6.0.0.jar!/DataTable.AccrualMatchFundedCostOfFundsPayableAggregation.xml}
    

    Sample logs 2

    2022-09-01 09:08:05.369  INFO 119618 --- [           main] c.c.c.ConfigServicePropertySourceLocator : Fetching config from server at : http://localhost:8887
    2022-09-01 09:08:05.966  INFO 119618 --- [           main] c.c.c.ConfigServicePropertySourceLocator : Located environment: name=userManagement-service, profiles=[development], label=null, version=null, state=null
    2022-09-01 09:08:05.968  INFO 119618 --- [           main] b.c.PropertySourceBootstrapConfiguration : Located property source: [BootstrapPropertySource {name='bootstrapProperties-file:./userManagement-service.yaml'}, BootstrapPropertySource {name='bootstrapProperties-classpath:/config/userManagement-service.yaml'}]
    2022-09-01 09:08:05.975  INFO 119618 --- [           main] c.u.microservice.LaunchUserApplication   : The following profiles are active: development
    2022-09-01 09:08:08.025  INFO 119618 --- [           main] .s.d.r.c.RepositoryConfigurationDelegate : Bootstrapping Spring Data JPA repositories in DEFERRED mode.
    2022-09-01 09:08:08.637  INFO 119618 --- [           main] .s.d.r.c.RepositoryConfigurationDelegate : Finished Spring Data repository scanning in 597ms. Found 34 JPA repository interfaces.
    2022-09-01 09:08:08.818  WARN 119618 --- [           main] o.s.boot.actuate.endpoint.EndpointId     : Endpoint ID 'service-registry' contains invalid characters, please migrate to a valid format.
    2022-09-01 09:08:08.842  WARN 119618 --- [           main] o.s.boot.actuate.endpoint.EndpointId     : Endpoint ID 'hystrix.stream' contains invalid characters, please migrate to a valid format.
    2022-09-01 09:08:09.138  INFO 119618 --- [           main] o.s.cloud.context.scope.GenericScope     : BeanFactory id=267123e5-afdb-3749-aa42-6d221c5a80ab
    2022-09-01 09:08:09.730  INFO 119618 --- [           main] trationDelegate$BeanPostProcessorChecker : Bean 'zuulConfiguration' of type [com.usermanagement.microservice.ZuulConfiguration$$EnhancerBySpringCGLIB$$82804667] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
    

    Sample CSV

    Please let me know if this helps. I would be very grateful if any of the log files could be imported.
    Thanks in advance!

    Hi @cadrija

    I am using 7.17.6, Basic License

    In short, the samples you provided loaded up for me with no extra work; I simply used all the defaults and clicked Import.

    I wonder if you have some corrupt characters, or something else in the files... these samples loaded fine for me.

    A few things I noticed:

    First, your Sample 1 and Sample 2 only partially meet the criteria for the file upload, which supports:

  • Delimited text files, such as CSV and TSV
  • Newline-delimited JSON
  • Log files with a common format for the timestamp
    Sample 1 and Sample 2 have a common time format, but they are not CSV/TSV delimited or NDJSON, so the automatic parsing will be minimal; in other words, it will not parse the "message" portion.
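For comparison, newline-delimited JSON (NDJSON) means one complete JSON document per line. A hypothetical NDJSON rendering of a Sample 1 line (these field names are illustrative, not what the upload would produce) would look like:

```
{"@timestamp": "2022-09-21T04:39:42.473", "thread": "main", "loglevel": "ERROR", "message": "Test log: Inside LoanIQLoggingManager"}
```

A file of such lines is fully structured, so every field would be mapped on import rather than left inside an unparsed message.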

    Both of these imported the first time for me. ...

    (Screenshots from 2022-09-28, 7.37.49 AM and 7.39.58 AM, showing the upload and import screens)

    I just clicked through, and the log was parsed and loaded.

    BUT it is not going to parse the whole log message: because it is not CSV, TSV, or NDJSON, it is unstructured text.

    You will need to build your own custom parsing for these logs if you want additional parsing.
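As a sketch of what such custom parsing could look like (the pipeline name and field names here are hypothetical, not something produced in this thread), an ingest pipeline with a grok processor could split a Sample 1 line into its parts:

```
PUT _ingest/pipeline/sample1-logs
{
  "description" : "Hypothetical pipeline to parse the Sample 1 log format",
  "processors" : [
    {
      "grok" : {
        "field" : "message",
        "patterns" : [
          "%{TIMESTAMP_ISO8601:timestamp} \\[%{DATA:thread}\\] %{LOGLEVEL:loglevel}: %{GREEDYDATA:log_message}"
        ]
      }
    }
  ]
}
```

You can try a pattern like this against a few sample documents with the _ingest/pipeline/_simulate API before importing; the file upload's advanced settings also let you edit the generated grok pattern directly.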

    The results look like:

    GET discuss-sample-1/_search
    {
      "hits" : {
        "hits" : [
          {
            "_index" : "discuss-sample-1",
            "_type" : "_doc",
            "_id" : "xQ6JhIMB2qNNMYeXRN1I",
            "_score" : 1.0,
            "_source" : {
              "@timestamp" : "2022-09-21T04:39:42.473-07:00",
              "loglevel" : "ERROR",
              "message" : "2022-09-21T04:39:42,473 [main] ERROR: Test log: Inside LoanIQLoggingManager"
            }
          },
          {
            "_index" : "discuss-sample-1",
            "_type" : "_doc",
            "_id" : "xg6JhIMB2qNNMYeXRN1I",
            "_score" : 1.0,
            "_source" : {
              "@timestamp" : "2022-09-21T04:39:44.685-07:00",
              "loglevel" : "INFO",
              "message" : "2022-09-21T04:39:44,685 [main] INFO: Started loading datatables"
            }
          }
        ]
      }
    }

    In Discover

    (Screenshot: Sample 1 in Discover)

    This is Sample 2 in Discover:

    (Screenshot: Sample 2 in Discover)

    For Sample 3 you did not provide the actual CSV text ... I would expect that to load as well.

    @stephenb Thanks a lot!
    I tried with a partial file; it imports the first 1000 lines only. Let me analyze what is wrong with the whole file.
    Could you kindly help me with another query?
    What do I need to do in order to pull these logs live from another machine?
    My ELK is on the x.x.x.x host and the application log file is on the y.y.y.y host.
    I am learning about the integrations, but I am confused about which one to use in this case.

    What do I need to do in order to pull these logs live from another machine?
    My ELK is on the x.x.x.x host and the application log file is on the y.y.y.y host.

    Please open a new thread with the new question with all the details.

    You'll need to use Filebeat or the Elastic Agent; I would suggest that you read about those. That's a pretty common way to ship logs.

    If you're very new to all these concepts, I would just start with Filebeat.
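As a minimal sketch of that approach (the log path is a hypothetical placeholder, and x.x.x.x is just the Elasticsearch host placeholder from this thread - not a tested config), a filebeat.yml on the y.y.y.y machine could look like:

```
filebeat.inputs:
  - type: log                          # classic log input; 7.x also offers the filestream input
    paths:
      - /var/log/myapp/application.log # hypothetical path to your application log

output.elasticsearch:
  hosts: ["http://x.x.x.x:9200"]       # point Filebeat at your Elasticsearch host
```

Filebeat tails the files listed under paths and ships each new line to Elasticsearch; parsing can then be done with an ingest pipeline on the Elasticsearch side.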