
With the help of the 'json' library we can generate fake data in JSON format as well, say when we are writing an integration test for a RESTful service's POST or PUT operation:

print(json.dumps(employee, sort_keys=True, indent=4))
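A minimal sketch of this idea; the employee record below is a hypothetical request body assembled from a few Faker providers:

import json
from faker import Faker

fake = Faker()

# Hypothetical employee record built from Faker providers,
# e.g. to use as the body of a POST/PUT request in an integration test.
employee = {
    "name": fake.name(),
    "email": fake.email(),
    "company": fake.company(),
    "address": fake.address(),
}

print(json.dumps(employee, sort_keys=True, indent=4))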

The Faker library contains almost all the attributes required for generating fake data. For example:

print("Name:", fake.name())

Similarly, we can use address(), company(), country(), email(), credit_card_number(), currency(), and so on.
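A short sketch exercising the providers mentioned above on a single Faker instance; each call returns a freshly generated fake value:

from faker import Faker

fake = Faker()

print("Name:", fake.name())
print("Address:", fake.address())
print("Company:", fake.company())
print("Country:", fake.country())
print("Email:", fake.email())
print("Credit card:", fake.credit_card_number())
print("Currency:", fake.currency())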
#FAKER IN PYTHON CODE#
In this section, the article covers various examples of the Faker library. For generating one random name, we import the library with from faker import Faker, create a Faker instance, and call fake.name(). If we want to generate, say, 10 fake names, we can enhance the code by simply calling the fake.name() function inside a loop with for i in range(10):, as shown in the sketch below.
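A runnable version of both snippets, first one random name and then ten names generated in a loop:

from faker import Faker

fake = Faker()

# One random name
print("Name:", fake.name())

# Ten fake names, by calling fake.name() inside a loop
for i in range(10):
    print(fake.name())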

#FAKER IN PYTHON INSTALL#
Just like all the other Python packages, Faker installation is very similar: for a local installation we can use pip with pip install Faker.
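As a small sanity check after installing, a sketch that imports the package and reads the installed version through the standard library (assuming the distribution name Faker used above):

# Run after `pip install Faker`
from importlib.metadata import version

import faker  # the import name is lowercase

print("Installed Faker version:", version("Faker"))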
#FAKER IN PYTHON HOW TO#
Faker is an open-source Python library that allows you to create your own dataset, i.e. you can generate random data with random attributes like name, age, location, etc. It supports all major locales and languages, which is beneficial for generating data based on locality. Faker data can be used to tune machine learning models, for stress testing a model, etc. Depending upon your need, you can generate the data that best fits your demand. Faker data can also be used for learning purposes, like performing different operations on different data types, and the datasets generated can be used to tune, validate, and test a machine learning model. In order to explore Faker we need to install it using pip install faker. We will explore different functions of Faker, so we will import faker; we will also perform some operations on the dataset, for which we need to import pandas:

from faker import Faker
import pandas as pd

Now we will explore the different functions in the Faker library. For this, we need to initiate Faker using a variable, and then use that variable to generate different attributes:

exp = Faker()
print('Name: ', exp.name())
print('Address: ', exp.address())
print('DOB: ', exp.date_of_birth())

We can generate information according to different regions and localities in different languages; we just need to mention the language we want. Let's generate some data in the Japanese and Hindi languages by passing a locale such as 'ja_JP' when creating the Faker instance:

exp = Faker('ja_JP')
for i in range(5):
    print(exp.name())

We can also create our own sentences using the sentence function and the text function. In particular, we can create sentences from our own defined word list containing words of our choice, and Faker will generate fake sentences using only those words, as in the sketch below.
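A minimal sketch of this, with a placeholder word list of our own choosing (any list of strings works); Faker builds the fake sentences only from these words:

from faker import Faker

exp = Faker()

# Our own word list; the generated sentences use only these words
words = ['fake', 'data', 'python', 'faker', 'testing', 'dataset']

for i in range(3):
    print(exp.sentence(ext_word_list=words))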

Other than generating names and addresses, we can generate whole profiles for different persons that do not exist. We will use the profile function to generate a fake profile of a person:

exp.profile()

Faker can also generate a random dataset. Now we will use the profile function and generate a dataset that contains the profiles of 100 unique, fake people. We will create these profiles in the Hindi language (using the 'hi_IN' locale), and we will also use pandas to store them in a data frame:

exp = Faker('hi_IN')
data = [exp.profile() for i in range(100)]
df = pd.DataFrame(data)
df

The dataset we have created contains different attributes like residence, location, website, etc. We have stored these profiles in a data frame so that we can perform operations on it, like visualization and analysis, and we can use this dataset according to our needs. In this article, we saw how we can use Faker, an open-source Python library, to generate fake data and how we can create a fake dataset containing profiles of different fake people in different languages, locations, etc. Faker is a Python library used for generating fake data; fake data is mainly used for integration testing by creating dummy data in databases. Faker can generate meaningful fake data like names, addresses, emails, JSON data, and currency-related data, and it can also generate data from a given data set. The Faker library can also be used while writing mock test cases. The dataset created can be used for different purposes, like training a machine learning model or performing different operations on it. The article briefly explains how to work with the Faker library and covers multiple examples of it.
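To illustrate the kind of analysis mentioned above, a minimal sketch that rebuilds the fake profile data frame and inspects a couple of columns; the 'name', 'sex', and 'residence' keys are attributes returned by profile(), and the 'hi_IN' locale matches the walkthrough:

from faker import Faker
import pandas as pd

# Rebuild the Hindi-locale fake profile dataset from the walkthrough above
exp = Faker('hi_IN')
df = pd.DataFrame([exp.profile() for i in range(100)])

# Peek at a few profile attributes...
print(df[['name', 'sex', 'residence']].head())

# ...and a tiny bit of analysis, e.g. the distribution of the 'sex' attribute
print(df['sex'].value_counts())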
