Thread: How to test flat files?

How to test flat files? What if there is a masked or NULL value, how do you delete it?
Question asked by visitor Kavya.

Re: How to test flat files? For flat file testing you need to compare the data substring by substring, using comparison operators. Read the data into an array and compare it with the actual data, using concatenation or split functions, whichever is more convenient.
Hope you got the answer. This can be done more effectively using QTP.

The purpose of Data Quality tests is to verify the accuracy of the data in the inbound flat files. Check for duplicate rows in the inbound flat file that share the same unique key column, or a unique combination of columns, as per the business requirement.
Sample query to identify duplicates, assuming that the flat file data can be imported into a database table. Flat file standards may dictate that the values in certain columns should adhere to values in a domain. Verify that the values in the inbound flat file conform to reference data standards. Many data fields can contain a range of values that cannot be enumerated. However, there are reasonable constraints or rules that can be applied to detect situations where the data is clearly wrong.
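A minimal sketch of such a duplicate check, assuming the flat file data can be staged into a database table (the table and column names here are hypothetical, not from the source):

```python
import sqlite3

# Hypothetical inbound rows with a unique key on (account_id, txn_date).
rows = [
    {"account_id": "A1", "txn_date": "2020-01-01", "amount": "10"},
    {"account_id": "A1", "txn_date": "2020-01-01", "amount": "10"},  # duplicate
    {"account_id": "A2", "txn_date": "2020-01-02", "amount": "20"},
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (account_id TEXT, txn_date TEXT, amount TEXT)")
conn.executemany(
    "INSERT INTO staging VALUES (:account_id, :txn_date, :amount)", rows
)

# Key combinations that occur more than once are duplicates.
dupes = conn.execute(
    """SELECT account_id, txn_date, COUNT(*) AS n
       FROM staging
       GROUP BY account_id, txn_date
       HAVING COUNT(*) > 1"""
).fetchall()
print(dupes)  # → [('A1', '2020-01-01', 2)]
```

The same GROUP BY ... HAVING COUNT(*) > 1 pattern works against any database the file is staged into.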
Instances of fields containing values that violate the defined validation rules represent a quality gap that can impact inbound flat file processing. Example: date of birth (DOB). This is defined as the DATE datatype and can assume any valid date. However, a DOB in the future, or unreasonably far in the past, is probably invalid. Also, a child's date of birth should not be earlier than that of their parents. A related check is referential integrity: the goal is to identify orphan records in the child entity whose foreign key has no match in the parent entity.
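The DOB rule above might be sketched like this; the 130-year cutoff and the reference date are illustrative assumptions, not values from the source:

```python
from datetime import date

# Hypothetical rule: DOB must not be in the future and must not be
# unreasonably far in the past (assumed here as 130 years).
def dob_is_valid(dob: date, today: date = date(2020, 6, 1)) -> bool:
    oldest = date(today.year - 130, today.month, today.day)
    return oldest <= dob <= today

assert dob_is_valid(date(1980, 5, 17))
assert not dob_is_valid(date(2030, 1, 1))   # in the future
assert not dob_is_valid(date(1850, 1, 1))   # too far in the past
```

Rows failing such a predicate would be flagged as validation-rule violations before the file is consumed.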
Example: Consider a file import process for a CRM application which imports contact lists for existing Accounts. ETL Validator supports defining data quality rules in its Flat File Component, automating data quality testing without writing any database queries.
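The CRM orphan check can be sketched as follows; the account keys and contact rows are hypothetical illustrations:

```python
# Hypothetical CRM import: each contact row references an account_id that
# must already exist among the Accounts; unmatched rows are orphans.
accounts = {"ACC1", "ACC2"}                              # existing account keys
contacts = [("C1", "ACC1"), ("C2", "ACC9"), ("C3", "ACC2")]

orphans = [c for c in contacts if c[1] not in accounts]
print(orphans)  # → [('C2', 'ACC9')]
```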
Custom rules can be defined and added to the Data Model template. Data in the inbound flat files is generally processed and loaded into a database.
In some cases the output may also be another flat file. The purpose of Data Completeness tests is to verify that all the expected data is loaded into the target from the inbound flat file. Some of the tests that can be run are: compare and validate counts, aggregates (min, max, sum, avg) and actual data between the flat file and the target. Column- or attribute-level data profiling is an effective way to compare source and target data without comparing the entire data set. It is similar to comparing checksums of your source and target data.
These tests are essential when testing large amounts of data. Some common data profile comparisons that can be done between the flat file and the target are: Example 1: compare counts of non-null values between source and target for each column, based on the mapping. This is also a key requirement for data migration projects. Example 2: write a source query on the flat file that matches the data in the target table after transformation.
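A minimal sketch of the profile comparison in Example 1, assuming a small source file and a target table (names and data are hypothetical):

```python
import csv
import io
import sqlite3

# Hypothetical source flat file and a target table loaded from it.
src = io.StringIO("id,amount\n1,10\n2,\n3,30\n")
rows = list(csv.DictReader(src))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER, amount INTEGER)")
conn.executemany("INSERT INTO target VALUES (?, ?)",
                 [(1, 10), (2, None), (3, 30)])

# Profile the source: row count, non-null count, min/max/sum of amount.
amounts = [int(r["amount"]) for r in rows if r["amount"] != ""]
src_profile = (len(rows), len(amounts), min(amounts), max(amounts), sum(amounts))

# The equivalent profile on the target, computed in SQL.
tgt_profile = conn.execute(
    "SELECT COUNT(*), COUNT(amount), MIN(amount), MAX(amount), SUM(amount) "
    "FROM target"
).fetchone()

assert src_profile == tgt_profile  # matching profiles suggest a complete load
```

Comparing a handful of aggregates scales far better than row-by-row comparison on large files, at the cost of weaker guarantees.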
It takes care of loading the flat file data into a table for running validations. Data in the inbound Flat File is transformed by the consuming process and loaded into the target table or file.
It is important to test the transformed data. There are two approaches for testing transformations: white-box testing and black-box testing. White-box testing examines the internal logic of the transformation; for transformation testing, this involves reviewing the transformation logic from the flat file data ingestion design document and the corresponding code to come up with test cases.
The advantage with this approach is that the tests can be rerun easily on a larger data set. The disadvantage of this approach is that the tester has to reimplement the transformation logic.
Example: In a financial company, the interest earned on a savings account depends on the daily balance in the account over the month. The daily balances for the month arrive in an inbound CSV file consumed by the process that computes the interest.
Review the requirement and design for calculating the interest. Implement the logic using your favourite programming language. Compare your output with the data in the target table. Black-box testing is a method of software testing that examines the functionality of an application without peering into its internal structures or workings. For transformation testing, this involves reviewing the transformation logic from the mapping design document and setting up the test data appropriately.
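The white-box steps for the interest example might be sketched as follows. The rate, rounding rule and balances are assumptions for illustration, not the source's actual business logic:

```python
from decimal import Decimal, ROUND_HALF_UP

# Assumed rule: monthly interest = average daily balance x (annual rate / 12),
# rounded to 2 decimal places. The 1.2% rate is a hypothetical value.
ANNUAL_RATE = Decimal("0.012")

def expected_interest(daily_balances):
    avg = sum(daily_balances) / len(daily_balances)
    monthly = avg * ANNUAL_RATE / 12
    return monthly.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

# Daily balances as read from the inbound CSV (assumed 30-day month).
balances = [Decimal("1000")] * 15 + [Decimal("2000")] * 15

# Compare the reimplemented result with the value the ETL process loaded
# into the target table (the target value here is assumed).
target_value = Decimal("1.50")
assert expected_interest(balances) == target_value
```

Because the logic is reimplemented independently, the same script can be rerun against a much larger data set, which is exactly the advantage noted above.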
The advantage of this approach is that the transformation logic does not need to be reimplemented during testing. The disadvantage is that the tester needs to set up test data for each transformation scenario and come up with the expected values for the transformed data manually. Review the requirement for calculating the interest. Set up test data in the flat file for various scenarios of daily account balances.
Compare the transformed data in the target table with the expected values for the test data. The goal of performance testing is to validate that the process consuming the inbound flat files can handle files with the expected data volumes and arrival frequency. Example 1: The process ingesting the flat file might perform well when there are only a few records in the file, but poorly when there are a large number of rows.
Example 2: The flat file ingestion process may also perform poorly as the data volumes increase in the target table. Integration testing of the inbound flat file ingestion process and the related applications involves a sequence of steps.

Flat File Testing

Challenges in Flat File Testing

Testing of inbound flat files presents unique challenges because the producer of the flat file is usually a different organization within the enterprise, or an external vendor.
Flat File Testing Categories

File Ingestion Testing
Data Type Testing
Data Quality Testing
Data Completeness Testing
Data Transformation Testing
Performance Testing

When data is moved using flat files between enterprises, or between organizations within an enterprise, it is important to perform a set of file ingestion validations on the inbound flat files before consuming the data in those files.
Size and Format of the Flat Files

Although flat files are generally delimited or fixed width, it is common to have a header and footer in these files. Some of the relevant checks are: verify that the size of the file is within the expected range, where applicable.
Verify that the header, footer and column heading rows have the expected format and appear at the expected location within the flat file. Perform row count checks to cross-check the data in the header or footer with the values in the delimited data.
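These header, footer and row-count checks can be sketched as follows; the pipe-delimited layout with an "H|" header and an "F|<row count>" footer is a hypothetical file standard for illustration:

```python
import io

# Hypothetical layout: first line is a header "H|<date>", last line is a
# footer "F|<row count>", data rows are pipe-delimited in between.
flat_file = io.StringIO("H|2020-06-01\n1|alice\n2|bob\nF|2\n")
lines = flat_file.read().splitlines()

header, data, footer = lines[0], lines[1:-1], lines[-1]

assert header.startswith("H|"), "missing or malformed header row"
assert footer.startswith("F|"), "missing or malformed footer row"

# Cross-check the row count declared in the footer against the data rows.
declared = int(footer.split("|")[1])
assert declared == len(data), f"footer says {declared}, found {len(data)} rows"
```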
File Arrival, Processing and Deletion Times

Files arrive periodically into a specific network folder or an FTP location before getting consumed by a process.
For example, a file that was supposed to arrive yesterday may be delayed. After a file gets processed, it is supposed to be moved to a specific directory, retained there for a specified period of time and then deleted. However, the file may not get copied over.
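The arrival and archiving checks can be sketched like this; the 24-hour freshness window, folder layout and file name are assumptions for illustration:

```python
import os
import tempfile
import time

# Hypothetical drop folder with a freshly arrived file.
drop_dir = tempfile.mkdtemp()
path = os.path.join(drop_dir, "contacts_20200601.csv")
with open(path, "w") as f:
    f.write("id,name\n1,alice\n")

# Check 1: the file arrived within the expected window (assumed 24 hours).
age_seconds = time.time() - os.path.getmtime(path)
assert age_seconds < 24 * 3600, "expected file did not arrive on time"

# Check 2: after processing, the file is moved to an archive directory
# and removed from the drop folder.
archive_dir = os.path.join(drop_dir, "archive")
os.makedirs(archive_dir, exist_ok=True)
os.replace(path, os.path.join(archive_dir, os.path.basename(path)))

assert not os.path.exists(path)  # gone from the drop folder
assert os.path.exists(os.path.join(archive_dir, "contacts_20200601.csv"))
```

In practice the same checks would run against the real network folder or FTP location, typically from a scheduled monitoring job.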