Building a LAS File Data Explorer App with Streamlit | by Andy McDonald | Feb, 2023

February 4, 2023


Exploring Log ASCII Standard files using Python and Streamlit

Photo by Carlos Muza on Unsplash

LAS files are a common and simple way to transfer and store well log and/or petrophysical data within the oil and gas industry. The format was developed in the late 80s and early 90s by the Canadian Well Logging Society as a way to standardise and organise digital log information. LAS files are essentially structured ASCII files that contain multiple sections with information about the well and the data acquired from it; as such, they can be readily viewed within a standard text editor like Notepad or TextEdit.
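To make that section structure concrete, the short sketch below splits a hypothetical, heavily truncated LAS string into its tilde-delimited sections using only the Python standard library. The file contents here are made up for illustration; real files should be read with a dedicated parser such as lasio.

```python
# Minimal sketch: group the lines of a made-up, truncated LAS file
# by their tilde-delimited section headers (~V, ~W, ~C, ~A, ...).
las_text = """~VERSION INFORMATION
VERS.   2.0 : CWLS LOG ASCII STANDARD - VERSION 2.0
~WELL INFORMATION
WELL.   15/9-19 A : WELL NAME
~CURVE INFORMATION
DEPT.M      : Measured depth
GR  .API    : Gamma ray
~ASCII
1500.0  45.2
1500.5  47.8"""

sections = {}
current = None
for line in las_text.splitlines():
    if line.startswith("~"):
        current = line[1:].split()[0]   # e.g. "VERSION", "WELL"
        sections[current] = []
    elif current is not None:
        sections[current].append(line)

print(sorted(sections))   # section names found in the file
```

Each `~` line starts a new section, and everything up to the next `~` belongs to it — the same layout a dedicated parser relies on when it reads the well, curve and data sections.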

Streamlit is one of my favourite Python libraries for creating quick and easy-to-use dashboards or interactive tools. It is also great if you want to create an app where you or the end user doesn't have to worry about code. Therefore, within this article, we are going to see how we can use Streamlit to build a data explorer app for LAS files.

If you want to see the full app in action, check out the short video below.

Or explore the source code on GitHub:

If you want to see how you can work with LAS files in Python, then the following articles may be of interest:

The first part of our app involves importing the required libraries and modules.

These are:

After importing these libraries, we can add a line at the end to set the page width to full page and change the app's title in the browser window.

import streamlit as st
import lasio
import pandas as pd
from io import StringIO

# Plotly imports
from plotly.subplots import make_subplots
import plotly.graph_objects as go
import plotly.express as px

st.set_page_config(layout="wide", page_title='LAS Explorer v.0.1')

To check that Streamlit is working, we can run the following command in the terminal:

streamlit run app.py

This will open up a browser window with a blank Streamlit app.

Blank Streamlit app. Image by the author.

The first piece of code we are going to add to this app is a call to st.sidebar. This will create a column on the left-hand side of the app, and we will use this to store our navigation menu and file uploader widget.

st.sidebar.write('# LAS Data Explorer')
st.sidebar.write('To begin using the app, load your LAS file using the file upload option below.')

We can use st.sidebar.write to add a few messages and instructions for the end user. In this example, we will keep it relatively simple with the app title and a message on how to get started.

Once the sidebar is in place, we can begin implementing the file uploader piece of our code.

las_file = None

uploadedfile = st.sidebar.file_uploader(' ', type=['.las'])
las_file, well_data = load_data(uploadedfile)

if las_file:
    st.sidebar.success('File Uploaded Successfully')
    st.sidebar.write(f'<b>Well Name</b>: {las_file.well.WELL.value}', unsafe_allow_html=True)

To do this, we need to call upon st.file_uploader. We will also restrict the file types to just .las files. To make this more useful, we may want to include the capitalised version of the extension as well.

Next, we will call upon the load data function, which we will come to shortly. This function will be set up to return las_file as a lasio las file object and well_data as a dataframe containing the well log measurements.

Following that, we will check if we have a las file. If it is set to None then nothing will happen; however, if the file has been loaded successfully via the load_data function, then it will not be None and the code beneath it will execute.

The code within the if statement essentially displays a coloured callout followed by the well name from the las file.

Before we run the Streamlit app, we need to create the load_data function. This will allow us to read the data and generate the lasio las file object and pandas dataframe.

@st.cache
def load_data(uploaded_file):
    if uploaded_file is not None:
        try:
            bytes_data = uploaded_file.read()
            str_io = StringIO(bytes_data.decode('Windows-1252'))
            las_file = lasio.read(str_io)
            well_data = las_file.df()
            well_data['DEPTH'] = well_data.index
        except UnicodeDecodeError as e:
            st.error(f"error loading log.las: {e}")
    else:
        las_file = None
        well_data = None

    return las_file, well_data

When we run the Streamlit LAS Data Explorer app, we will see our sidebar on the left along with the file uploader widget.

After adding the sidebar to the LAS File Data Explorer Streamlit app. Image by the author.

We can then click on Browse Files and search for a las file.

Once that file has been loaded, we will see the green callout saying the file was loaded successfully, followed by the well name contained within the file.

Successful LAS file read with the LAS Data Explorer Streamlit app. Image by the author.

When someone launches the LAS Data Explorer app for the first time, it would be great to display the app's title and a brief description of what it does.

st.title('LAS Data Explorer - Version 0.2.0')
st.write('''LAS Data Explorer is a tool designed using Python and Streamlit to help you view and gain an understanding of the contents of a LAS file.''')
st.write('\n')

When we rerun the app, we will now see our home page. This could be expanded to include additional instructions, details about the app and how to get in touch if there is a problem.

LAS Data Explorer Streamlit App after creating a Home page. Image by the author.

When building a Streamlit app, it is good practice to split code up into functions and call them at the appropriate time. This makes the code more modular and easier to navigate.

For our home page, we will place the above code into a function called home().

def home():
    st.title('LAS Data Explorer - Version 0.2.0')
    st.write('''LAS Data Explorer is a tool designed using Python and Streamlit to help you view and gain an understanding of the contents of a LAS file.''')
    st.write('\n')

When building Streamlit apps, it is very easy to fall into the trap of continuously adding sections one after the other, resulting in a long scrollable web page.

One way to make Streamlit apps more navigable is by adding a navigation menu. This allows you to split content over multiple pages.

One way to achieve this is to use a series of radio buttons which, when toggled, will change the content displayed on the main part of the app.

First, we need to assign a title for our navigation section, and then we have to call upon st.sidebar.radio and pass in a list of pages we want the user to be able to navigate to.

# Sidebar Navigation
st.sidebar.title('Navigation')
options = st.sidebar.radio('Select a page:', ['Home', 'Header Information', 'Data Information', 'Data Visualisation', 'Missing Data Visualisation'])

When we run the app, we will see that we now have a navigation menu represented by the radio buttons.

LAS Data Explorer after adding in a radio button navigation menu. Image by the author.

At the moment, if you click on the buttons, nothing will happen.

We need to tell Streamlit what to do when a selection is made.

This is achieved by creating an if/elif statement like the one below. When an option is selected, a specific function will be called.

For example, if the user has Home selected, then the home function that was created earlier will be displayed.

if options == 'Home':
    home()
elif options == 'Header Information':
    header(las_file)
elif options == 'Data Information':
    raw_data(las_file, well_data)
elif options == 'Data Visualisation':
    plot(las_file, well_data)
elif options == 'Missing Data Visualisation':
    missing(las_file, well_data)
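As the number of pages grows, an alternative to the if/elif chain is a dictionary that maps each radio label to its page function. This is a sketch with placeholder functions standing in for the app's real pages:

```python
# Placeholder page functions standing in for the app's home(), header(), etc.
def home():
    return 'rendered home page'

def header_info():
    return 'rendered header page'

# Map each radio button label to the function that renders that page.
pages = {
    'Home': home,
    'Header Information': header_info,
}

options = 'Header Information'   # in the app this comes from st.sidebar.radio
result = pages[options]()        # look up the page function and call it
print(result)
```

This keeps the list of navigation labels and the dispatch logic in one place, so adding a page becomes a single dictionary entry.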

Let's begin implementing the other sections to start displaying some content.

Within each las file, there is a section at the top that contains information about the well. This includes Well Name, Country, Operator and much more.

Example of a LAS file header from the Volve field. Image by the author.

To read this information, we will create a new function called header and then loop through each row within the header.

To prevent errors when the user clicks on the Header Information radio button, we need to check if a las file object has been created during the loading process. Otherwise, we present the user with an error.

Then for each header item, we will display the descriptive name ( item.descr ), the mnemonic ( item.mnemonic ) and the associated value ( item.value ).

def header(las_file):
    st.title('LAS File Header Information')
    if not las_file:
        st.warning('No file has been uploaded')
    else:
        for item in las_file.well:
            st.write(f"<b>{item.descr.capitalize()} ({item.mnemonic}):</b> {item.value}", unsafe_allow_html=True)

When the app is rerun and the Header Information page is selected from the Navigation menu, we will now see the relevant well information.

Well log header information from a LAS file. Image by the author.

After the header information has been successfully read, we next want to look at what well log measurements are contained within the las file.

To do this, we will create a simple function called raw_data which will:

  • go through each measurement within the las file and write out its mnemonic, unit and description
  • provide a count of the total number of measurements present
  • create a statistical summary table for each measurement using the describe method from pandas
  • create a data table with all of the raw values

This is a lot for a single function to do and it could benefit from being tidied up, but for this simple app, we will keep it all together.
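The statistical summary table in the steps above comes directly from the pandas describe method. On a tiny made-up dataframe with two curves, it yields the count, mean, standard deviation, min, quartiles and max of each column:

```python
import pandas as pd

# Made-up miniature "well log" dataframe with two curves.
well_data = pd.DataFrame({'GR': [45.2, 60.1, 55.3, 70.8],
                          'RHOB': [2.45, 2.51, 2.48, 2.55]})

summary = well_data.describe()
print(summary)
print(summary.loc['mean', 'GR'])   # mean gamma ray of the four samples
```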

def raw_data(las_file, well_data):
    st.title('LAS File Data Information')
    if not las_file:
        st.warning('No file has been uploaded')
    else:
        st.write('**Curve Information**')
        for count, curve in enumerate(las_file.curves):
            st.write(f"   {curve.mnemonic} ({curve.unit}): {curve.descr}", unsafe_allow_html=True)
        st.write(f"<b>There are a total of: {count+1} curves present within this file</b>", unsafe_allow_html=True)

        st.write('<b>Curve Statistics</b>', unsafe_allow_html=True)
        st.write(well_data.describe())
        st.write('<b>Raw Data Values</b>', unsafe_allow_html=True)
        st.dataframe(data=well_data)

When the Streamlit app is rerun, we will see all of the data relating to the well log measurements.

First, we have the well measurement information and associated statistics.

LAS Well Log Measurement information. Image by the author.

Followed by the raw data values.

LAS Well Log Measurement information. Image by the author.

As with any dataset, it can be hard to get a handle on what the data looks like by analysing the raw numbers. To take things to the next level, we can use interactive plots.

These will make it easier for the end user to gain a better understanding of the data.

The following code generates a number of plots on a Streamlit page. It is all contained within a single function for ease of use within this app. Remember, each function represents a page within the LAS Data Explorer app.

To save having to use multiple pages, the code below will generate three expanders for three different plots: a line plot, a histogram and a scatter plot (also known as a cross plot within petrophysics).

def plot(las_file, well_data):
    st.title('LAS File Visualisation')

    if not las_file:
        st.warning('No file has been uploaded')

    else:
        columns = list(well_data.columns)
        st.write('Expand one of the following to visualise your well data.')
        st.write("""Each plot can be interacted with. To change the scales of a plot/track, click on the left hand or right hand side of the scale and change the value as required.""")

        with st.expander('Log Plot'):
            curves = st.multiselect('Select Curves To Plot', columns)
            if len(curves) <= 1:
                st.warning('Please select at least 2 curves.')
            else:
                curve_index = 1
                fig = make_subplots(rows=1, cols=len(curves), subplot_titles=curves, shared_yaxes=True)

                for curve in curves:
                    fig.add_trace(go.Scatter(x=well_data[curve], y=well_data['DEPTH']), row=1, col=curve_index)
                    curve_index += 1

                fig.update_layout(height=1000, showlegend=False, yaxis={'title': 'DEPTH', 'autorange': 'reversed'})
                fig.layout.template = 'seaborn'
                st.plotly_chart(fig, use_container_width=True)

        with st.expander('Histograms'):
            col1_h, col2_h = st.columns(2)
            col1_h.header('Options')

            hist_curve = col1_h.selectbox('Select a Curve', columns)
            log_option = col1_h.radio('Select Linear or Logarithmic Scale', ('Linear', 'Logarithmic'))
            hist_col = col1_h.color_picker('Select Histogram Colour')
            st.write('Color is' + hist_col)

            if log_option == 'Linear':
                log_bool = False
            elif log_option == 'Logarithmic':
                log_bool = True

            histogram = px.histogram(well_data, x=hist_curve, log_x=log_bool)
            histogram.update_traces(marker_color=hist_col)
            histogram.layout.template = 'seaborn'
            col2_h.plotly_chart(histogram, use_container_width=True)

        with st.expander('Crossplot'):
            col1, col2 = st.columns(2)
            col1.write('Options')

            xplot_x = col1.selectbox('X-Axis', columns)
            xplot_y = col1.selectbox('Y-Axis', columns)
            xplot_col = col1.selectbox('Colour By', columns)
            xplot_x_log = col1.radio('X Axis - Linear or Logarithmic', ('Linear', 'Logarithmic'))
            xplot_y_log = col1.radio('Y Axis - Linear or Logarithmic', ('Linear', 'Logarithmic'))

            if xplot_x_log == 'Linear':
                xplot_x_bool = False
            elif xplot_x_log == 'Logarithmic':
                xplot_x_bool = True

            if xplot_y_log == 'Linear':
                xplot_y_bool = False
            elif xplot_y_log == 'Logarithmic':
                xplot_y_bool = True

            col2.write('Crossplot')

            xplot = px.scatter(well_data, x=xplot_x, y=xplot_y, color=xplot_col, log_x=xplot_x_bool, log_y=xplot_y_bool)
            xplot.layout.template = 'seaborn'
            col2.plotly_chart(xplot, use_container_width=True)
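One small refinement worth noting: each Linear/Logarithmic if/elif pair above can be collapsed into a dictionary lookup, which also guarantees the boolean is always defined. A minimal sketch:

```python
# Map each radio label to the boolean that Plotly's log_x / log_y expects.
SCALE_TO_BOOL = {'Linear': False, 'Logarithmic': True}

log_option = 'Logarithmic'   # in the app this comes from a radio widget
log_bool = SCALE_TO_BOOL[log_option]
print(log_bool)
```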

Once the above code has been implemented, we can see that we have the LAS File Visualisation page with three expandable containers.

Within geoscience and petrophysics, we often plot data on line plots, commonly referred to as log plots. The y-axis typically represents the depth along a wellbore, and the x-axis represents the data we want to visualise. This allows us to easily visualise trends and patterns within these measurements with depth.

Within the Log Plot section, we can select specific columns from the dataframe and display them in the interactive Plotly chart.

Well log plots created using Plotly and displayed in the LAS Data Explorer Streamlit app. Image by the author.

Histograms show the data distribution and allow us to condense a large amount of data into a small and concise plot.

Within the Histogram section, we have a few basic options. We can select a column from the dataframe to display and decide whether we want it displayed linearly or logarithmically.

Finally, we have the option to use the colour picker from Streamlit. This allows you to choose the colour for the histogram and can enhance your visualisation for presentations and reports.

Histogram created using Plotly within the LAS Data Explorer Streamlit app. Image by the author.

Scatter plots (crossplots) are commonly used within petrophysics and data science to compare two variables. From this type of graph, we can understand if there is a relationship between the two variables and how strong that relationship is.

Within the Crossplot section of the Data Visualisation page, we can select the x and y axis variables, as well as a third variable to colour code the data.

Lastly, we can set the x and y axes to either a linear or logarithmic scale.

Scatter plot / crossplot created using Plotly within the LAS Data Explorer Streamlit app. Image by the author.

Missing data is one of the most common data quality issues we face when working with datasets. Data can be missing for a multitude of reasons, ranging from sensor failure to improper and potentially careless data management.

When working with datasets, it is essential that missing data is identified and the root cause behind it is understood. A proper understanding of why data is missing is key to making pragmatic decisions on how to deal with it, especially as many machine learning algorithms are incapable of handling missing values.

Within Python, we could use the textual data summaries provided by the pandas describe function. Whilst this is useful, it often helps to visualise missing data values on graphs. This allows us to easily identify patterns and relationships that may not be obvious with text-based summaries.
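As a quick illustration of the textual route, pandas can count missing values per column directly, and the 0/1 completeness flags used later by the plotting code come from the same API:

```python
import numpy as np
import pandas as pd

# Made-up curves with gaps: DT is missing over part of the interval.
well_data = pd.DataFrame({'GR': [45.2, 47.8, 50.1, 52.3],
                          'DT': [np.nan, np.nan, 85.0, 88.2]})

missing_counts = well_data.isnull().sum()
print(missing_counts)

# 0/1 completeness flags per sample, as used by the missing data plot.
data_nan = well_data.notnull().astype('int')
print(data_nan['DT'].tolist())
```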

To create interactive plots of data completeness, we can leverage the Plotly library. The code below sets up the Missing Data Visualisation page within the LAS Data Explorer app.

First, we check if we have a valid las file; if we do, we start creating the page with some explanatory text.

Next, we give the user an option to select all of the data within the dataframe or specific columns. To the right of this, we allow the user to change the colour of the bars in the charts.

Then we move on to plotting the data based on the user's selection.

def missing(las_file, well_data):
    st.title('LAS File Missing Data')

    if not las_file:
        st.warning('No file has been uploaded')

    else:
        st.write("""The following plot can be used to identify the depth range of each of the logging curves. To zoom in, click and drag on one of the tracks with the left mouse button. To zoom back out double click on the plot.""")

        data_nan = well_data.notnull().astype('int')
        # Need to set up an empty list for len check to work
        curves = []
        columns = list(well_data.columns)
        columns.pop(-1)  # pop off depth

        col1_md, col2_md = st.columns(2)

        selection = col1_md.radio('Select all data or custom selection', ('All Data', 'Custom Selection'))
        fill_color_md = col2_md.color_picker('Select Fill Colour', '#9D0000')

        if selection == 'All Data':
            curves = columns
        else:
            curves = st.multiselect('Select Curves To Plot', columns)

        if len(curves) <= 1:
            st.warning('Please select at least 2 curves.')
        else:
            curve_index = 1
            fig = make_subplots(rows=1, cols=len(curves), subplot_titles=curves, shared_yaxes=True, horizontal_spacing=0.02)

            for curve in curves:
                fig.add_trace(go.Scatter(x=data_nan[curve], y=well_data['DEPTH'],
                                         fill='tozerox', line=dict(width=0), fillcolor=fill_color_md),
                              row=1, col=curve_index)
                fig.update_xaxes(range=[0, 1], visible=False)
                curve_index += 1

            fig.update_layout(height=700, showlegend=False, yaxis={'title': 'DEPTH', 'autorange': 'reversed'})
            # rotate all of the subtitles by 90 degrees
            for annotation in fig['layout']['annotations']:
                annotation['textangle'] = -90
            fig.layout.template = 'seaborn'
            st.plotly_chart(fig, use_container_width=True)

When we go to this page of the LAS Data Explorer, we are presented with an interactive Plotly chart, as seen below. If the user has selected "All Data", then all of the columns will be shown.

Using Streamlit to display all columns from a pandas dataframe in a Plotly chart. Image by the author.

If a user has selected "Custom Selection", then they can select the columns directly from the dataframe.

Using a Streamlit multiselect to select columns from a dataframe and display them in a Plotly chart. Image by the author.

Check out my article below if you want to see other ways of identifying missing values using Python:

Within this article, we have seen how to build an app using Streamlit and Python for exploring LAS files. Whilst this is a basic app, it can provide a useful alternative to looking at raw LAS files within a text editor. More functionality could be added to edit the files or convert them to another standard format. The possibilities are endless!

The data used within this tutorial is a subset of the Volve Dataset that Equinor released in 2018. Full details of the dataset, including the licence, can be found at the link below.

The Volve data license is based on the CC BY 4.0 license. Full details of the license agreement can be found here:


