I am new to Elastic.
I have set up ELK on an Ubuntu machine.
I tried to upload a log file and a CSV file through the Kibana UI file upload option, but in both cases I am getting a "File structure cannot be determined" error.
I have tried overriding the settings to sample only 10 lines, but I am still getting the same issue.
Please help.
My CSV file looks like this - just a simple CSV file
[screenshot: CSV file]
My log file looks like this
[screenshot: log file]
What version of Kibana are you on? - 7.17.6
How large are your CSV and log files? - Log file is 4.76 MB and CSV file is 2.38 MB
Do you see any relevant errors in the Kibana server logs (likely located at /var/log/kibana)? - Okay, I will check this.
Update - I checked the logs; no error stack trace found.
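Side note: on a systemd-based install, Kibana may log to journald rather than a file. Assuming the service is named kibana, something like this would show recent entries:

sudo journalctl -u kibana.service --since "1 hour ago"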
I checked with a different CSV file; it seems that if the file has a timestamp matching one of the listed time formats, then it gets imported.
However, all of my log files are throwing the above-mentioned error, even though they have timestamps and the time format matches the listed options.
Apologies you are having issues. Can you provide a couple of lines of the raw CSV as text (not a screenshot)?
Raw lines exactly as they are in the CSV; you can anonymize anything you need to, we just want to try with your data.
I suspect there is still a format issue, and you can change the format / data type under the advanced settings. Give us a couple of lines and we may be able to help.
2022-09-01 09:08:05.369 INFO 119618 --- [ main] c.c.c.ConfigServicePropertySourceLocator : Fetching config from server at : http://localhost:8887
2022-09-01 09:08:05.966 INFO 119618 --- [ main] c.c.c.ConfigServicePropertySourceLocator : Located environment: name=userManagement-service, profiles=[development], label=null, version=null, state=null
2022-09-01 09:08:05.968 INFO 119618 --- [ main] b.c.PropertySourceBootstrapConfiguration : Located property source: [BootstrapPropertySource {name='bootstrapProperties-file:./userManagement-service.yaml'}, BootstrapPropertySource {name='bootstrapProperties-classpath:/config/userManagement-service.yaml'}]
2022-09-01 09:08:05.975 INFO 119618 --- [ main] c.u.microservice.LaunchUserApplication : The following profiles are active: development
2022-09-01 09:08:08.025 INFO 119618 --- [ main] .s.d.r.c.RepositoryConfigurationDelegate : Bootstrapping Spring Data JPA repositories in DEFERRED mode.
2022-09-01 09:08:08.637 INFO 119618 --- [ main] .s.d.r.c.RepositoryConfigurationDelegate : Finished Spring Data repository scanning in 597ms. Found 34 JPA repository interfaces.
2022-09-01 09:08:08.818 WARN 119618 --- [ main] o.s.boot.actuate.endpoint.EndpointId : Endpoint ID 'service-registry' contains invalid characters, please migrate to a valid format.
2022-09-01 09:08:08.842 WARN 119618 --- [ main] o.s.boot.actuate.endpoint.EndpointId : Endpoint ID 'hystrix.stream' contains invalid characters, please migrate to a valid format.
2022-09-01 09:08:09.138 INFO 119618 --- [ main] o.s.cloud.context.scope.GenericScope : BeanFactory id=267123e5-afdb-3749-aa42-6d221c5a80ab
2022-09-01 09:08:09.730 INFO 119618 --- [ main] trationDelegate$BeanPostProcessorChecker : Bean 'zuulConfiguration' of type [com.usermanagement.microservice.ZuulConfiguration$$EnhancerBySpringCGLIB$$82804667] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
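For reference, the timestamps above (e.g. 2022-09-01 09:08:05.369) should correspond to a Java-time-style pattern such as the one below; this is my assumption about what the upload UI's Timestamp format override expects, in case the automatic detection misses it:

yyyy-MM-dd HH:mm:ss.SSS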
[attachment: Sample CSV]
Please let me know if this helps. I would be very grateful if any of the log files could be imported.
Thanks in advance!
In short, the samples you provided loaded right up for me with no extra work; I simply used all the defaults and clicked Import.
I wonder if you have some corrupt characters, or something else, in the files... these samples loaded fine for me.
A little of what I noticed:
First, your Sample 1 and Sample 2 only partially meet the criteria:
Delimited text files, such as CSV and TSV
Newline-delimited JSON
Log files with a common format for the timestamp
Sample 1 and Sample 2 have a common time format but are not CSV/TSV delimited or NDJSON, so the automatic parsing will be minimal; in other words, it will not parse the "message" portion.
Both of these imported on the first try for me.
[screenshots: file upload import screens for Sample 1 and Sample 2]
I just clicked through, and the log was parsed and loaded.
BUT it is not going to parse the whole log message, because it is not CSV, TSV, or NDJSON; it is unstructured text.
You will need to build your own custom parsing for these logs if you want additional fields extracted; a sketch follows below.
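As a rough sketch (not tested against your full files, and the field names are just my choice), a Grok pattern along these lines could go into the upload UI's Grok pattern override, or into a grok processor in an ingest pipeline, to pull the level, PID, thread, and logger out of these Spring Boot-style lines:

%{TIMESTAMP_ISO8601:timestamp}\s+%{LOGLEVEL:log.level}\s+%{NUMBER:process.pid}\s+---\s+\[\s*%{DATA:thread}\]\s+%{NOTSPACE:logger}\s*:\s*%{GREEDYDATA:message}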
[screenshot: imported log data (Sample 1), with the message left as unstructured text]
This is Sample 2 imported: [screenshot]
For Sample 3 you did not provide the actual CSV text ... I would expect that to load as well.
@stephenb Thanks a lot!
I tried with a partial file; it is importing the first 1000 lines only. Let me analyze what is wrong with the whole file.
Could you kindly help me with another query?
What do I need to do in order to pull these logs live from another machine?
My ELK is on x.x.x.x host and the application log file is on y.y.y.y host.
I am learning about the integrations but I am confused about which one to use in this case.
What do I need to do in order to pull these logs live from another machine?
My ELK is on x.x.x.x host and the application log file is on y.y.y.y host.
Please open a new thread with the new question with all the details.
You'll need to use Filebeat or the Elastic Agent; I would suggest that you read about those. That's a pretty common way to ship logs.
If you're very new to all these concepts, I might just start with filebeat.
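As a starting point, a minimal filebeat.yml on the y.y.y.y host could look something like the sketch below (this assumes Filebeat 7.17 shipping straight to Elasticsearch; the paths, host address, and credentials are placeholders you would need to adjust):

filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.log    # placeholder: the application log location on y.y.y.y

output.elasticsearch:
  hosts: ["http://x.x.x.x:9200"]    # placeholder: your Elasticsearch host
  # add username/password or an API key here if security is enabled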