Article: Iqbal Muneeb (Customs Valuation) Data Analysis

[caption id="attachment_22288" align="alignleft" width="150"] Iqbal Muneeb (Customs Valuation)[/caption]

Data Analysis

What do we do in our daily lives from dawn to dusk? We process information. This is essential for our survival and growth. Machines can help in every walk of life, including information processing: what we do, machines can do far faster, though by no means better. In the present-day Customs context, the volume of transactions has increased manifold over the years, and the complexity of individual transactions has also grown significantly owing to multiple tariff rates, concessionary and regulatory regimes, bilateral and multilateral agreements such as FTAs, PTAs and EHs, the administration of quantity and origin quotas, the diversity of specifications, and the attendant classification and valuation issues, to name a few. The workforce responsible for all this transaction processing could not be increased proportionately, so it is vital that Customs appreciates the significance of data analysis.

When there are so many things to look into, the probability of non-compliance, both voluntary and involuntary, also increases. Today's Customs administrations deal with this by upgrading the skill sets of the workforce, modernizing transaction-processing systems through business process re-engineering, institutionalizing a Risk Management System (RMS) at the organizational level, and formalizing information and intelligence sharing with businesses, other government and regulatory authorities, and other Customs administrations. None of this is possible without data analysis: it is the bloodstream that keeps almost every other organ alive.

Gradually and steadily, the significance of data analysis is increasing, especially at the senior executive level. It is important to understand the nature of the data we get, create, deal with, analyze, draw conclusions from, base decisions on, and in turn generate. In ordinary parlance, data analysis means obtaining data, inspecting it to identify and remove distortions, normalizing it, and applying one or more statistical techniques to convert it into usable information for decision making. An analyst's job does not start with analysis per se; it includes identifying the channels that introduce distortions into the data, rendering analyses futile and at times counterproductive. Distortions built on distortions create yet more distortions rather than addressing the anomalies. Officers frequently complain about distortions in the data, with little appreciation that a significant portion of those distortions has been introduced by processing officers, and to some extent by monitoring officers, themselves. The most common distortions in Customs clearance data include, but are not limited to, violations of the standard 'Unit of Measure', incomplete or incorrect declarations and specifications, and incorrect claims of concessions and exemptions. One example clarifies the position. FBR notified standard Units of Measure (UoM) through CGO 7/2006 dated June 26, 2006, making it mandatory to file declarations accordingly; the purpose was to facilitate the collection, comparison and analysis of trade statistics. Stakeholders, importers, third-party brokers and even processing officers have frequently ignored this standard without fear of any sanction. Although the clearance system was enabled to indicate the correct UoM while filing and processing quantity-related declarations, other units of measure were entered in the Item Description field to hoodwink the automated clearance system.
Processing and monitoring officers have all along been soft on the issue and could not prevail upon the trade to indicate the standard UoM correctly; for instance, against a standard UoM of Kg, different units, such as number, sets, gross, pieces, bundles or tonnes, would sheepishly be declared in the UoM field. Contraventions on this score were not given due attention, despite legislative support under SRO 499(I)/2009 dated 13, 2009. The end result is that data pertaining to a particular classification heading cannot be compared. It has also been noted that when the system selects such a transaction for a compliance check, processing officers frequently adjust the per-unit value to arrive at a given import value for the calculation of duty and taxes, without correcting the unit value to the standard UoM and without realizing that this anomaly has far-reaching consequences in the clearance system on two counts: the subsequent processing officer's limits of discretion are widened, and the system's ability to automate such transactions is compromised. Expectations of 'garbage in, gospel out' are not reasonable. Selectivity criteria, however configurable, cannot deal with such self-created, indeed self-inflicted, anomalies. Distortions can be removed through inspection, sorting, dispersal analysis, trimmed means, kurtosis (a measure of whether a data set has an outlier problem; larger kurtosis indicates a more serious one), expert judgment, and other techniques.
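
The trimmed-mean and kurtosis checks mentioned above can be sketched in a few lines of Python. The unit values below are purely illustrative: the 250.0 entry stands in for the kind of mis-declared outlier that a non-standard UoM produces in per-kg data.

```python
from statistics import mean

def trimmed_mean(values, proportion=0.2):
    """Mean after discarding the lowest and highest `proportion` of values."""
    data = sorted(values)
    k = int(len(data) * proportion)
    trimmed = data[k:len(data) - k] if k else data
    return mean(trimmed)

def excess_kurtosis(values):
    """Excess kurtosis; strongly positive values suggest an outlier problem."""
    m = mean(values)
    n = len(values)
    var = sum((x - m) ** 2 for x in values) / n
    if var == 0:
        return 0.0
    m4 = sum((x - m) ** 4 for x in values) / n
    return m4 / var ** 2 - 3.0

# Hypothetical per-kg unit values for one classification heading.
unit_values = [2.1, 2.3, 2.2, 2.4, 2.0, 2.5, 2.3, 250.0]

print(mean(unit_values))           # plain mean, dragged up by the outlier
print(trimmed_mean(unit_values))   # trimmed mean, close to the bulk of the data
print(excess_kurtosis(unit_values))  # strongly positive: outlier problem present
```

The gap between the plain mean and the trimmed mean, together with a large positive kurtosis, is exactly the signal an analyst would use to flag such a heading for inspection before the data feeds any valuation benchmark.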

After the removal of distortions, the data should be normalized. The characteristics of the data elements should be known beforehand; the data is examined to check whether it matches the desired characteristics and, if not, how to deal with it, for example by correcting formats and types. After normalization, the data is ready for analysis. Numerous tools are available; many are built into the system and made accessible to processing officers as MIS reports. Officers can also pull clearance data and run their own analyses for particular jobs or requirements, for which MS Office's Excel or any similar application can be used. Pakistan Customs most frequently uses frequency counts, cross-tabulations (as in Excel pivot tables), means, dispersal analysis, skewness, congruence analyses, range analyses, and inferential (work-back) techniques, among others.

Keeping the current perspective in view, FBR should take the following steps:

  1. Train senior executives in data analysis so that they become adept at helping design modern techniques of monitoring and control.
  2. Senior officers should do data analysis themselves; it will help them develop insights and identify patterns of non-compliance. Repeatable analytics can be automated and shifted to machines.
  3. WCO has already released Data Model Version 3.6.0 in May last year. It is a library of data components and electronic document templates that can be used to exchange data effectively. Besides data element attributes, it also gives the IDs and names of UN/TDED elements. All present and future developments in the system should comply with the new standard. This will help in developing EDI not only with regulatory authorities in Pakistan but also with other Customs administrations.
  4. There are huge gaps in trade data communication within and across borders. Closing them would create synergy that can be capitalized on at minimal cost to authorities and stakeholders.
  5. Currently, machine-based clearances form only a minor portion of total trade. Selectivity criteria run on information, and distortions in the data would cripple their functioning. Data cleansing exercises should be initiated, and a sanctioning mechanism may be put in place to avert further distortions.
  6. Obvious and available sources of data should not be ignored. Information kept in silos, and not integrated because of turf wars among authorities, results in poor utilization of resources. FBR can proactively deal with the situation by developing a system whereby information available at different formations under its control is made accessible to appropriate users, and by negotiating with other regulatory and enforcement authorities, such as NADRA, the State Bank, provincial revenue departments, MSA and Immigration, for real-time data sharing.
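
A data cleansing exercise of the kind recommended above can begin with a simple rule check of declared UoM against the notified standard. The standard-UoM table below is purely illustrative, in the spirit of CGO 7/2006, not the notified values themselves.

```python
# Hypothetical standard-UoM table; headings and units are illustrative only.
STANDARD_UOM = {"5208.1100": "Kg", "8471.3000": "Number"}

def cleanse(declarations):
    """Split records into compliant ones and violations needing correction."""
    compliant, violations = [], []
    for rec in declarations:
        if STANDARD_UOM.get(rec["heading"]) == rec["uom"]:
            compliant.append(rec)
        else:
            violations.append(rec)
    return compliant, violations

records = [
    {"heading": "5208.1100", "uom": "Kg", "qty": 1200},
    {"heading": "5208.1100", "uom": "Bundles", "qty": 40},  # non-standard UoM
    {"heading": "8471.3000", "uom": "Number", "qty": 15},
]

ok, bad = cleanse(records)
print(len(ok), len(bad))
```

Records landing in the violations list are candidates both for correction before any cross-heading comparison and for the sanctioning mechanism proposed in point 5.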
What is your view on this? Let us know in the comments section.