
Thursday, November 12, 2009

Move Order Types

Move orders can be created from different sources. These move orders are identified by the move order type at the header level. The different types of move orders are:

Move Order Requisitions: These are created from the Move Orders form. Based on the setup at the organization and item level, move orders may require approval before they can be allocated and transacted. These requisitions can be created to charge an account or to move material from one subinventory to another.

Replenishment Move Orders: These are auto-created in pre-approved status. Depending on the planning type, you can replenish the quantity of an item in a subinventory. Three types of replenishment plans (in relation to move orders) are available:

  1. Min-max Replenishment Planning
  2. Kanban Replenishment Planning
  3. Replenishment Count Planning

Based on these sources, when the appropriate programs are run, move orders are created automatically.

Pick Wave Move Order Requisitions: These move orders are specifically for the sales order picking process. When the Pick Release program is run, move orders are created as pre-approved and allocated. The Pick Confirm process transacts these pick wave move orders.
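The move order type lives on the move order header, so the three sources above can be distinguished with a simple query. The sketch below assumes the commonly documented seeded values of MOVE_ORDER_TYPE (1 = Requisition, 2 = Replenishment, 3 = Pick Wave); verify the MOVE_ORDER_TYPE lookup in your own instance before relying on it.

```sql
-- Sketch: count move order headers by type for one inventory org.
-- MOVE_ORDER_TYPE values assumed: 1 = Requisition, 2 = Replenishment,
-- 3 = Pick Wave (verify against the MOVE_ORDER_TYPE lookup in your instance).
select mtrh.move_order_type,
       decode(mtrh.move_order_type,
              1, 'Move Order Requisition',
              2, 'Replenishment Move Order',
              3, 'Pick Wave Move Order',
              'Other') move_order_type_name,
       count(*) header_count
from   apps.mtl_txn_request_headers mtrh
where  mtrh.organization_id = :organization_id
group  by mtrh.move_order_type
order  by mtrh.move_order_type;
```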


Tuesday, November 10, 2009

Serial Number Genealogy

Genealogy by definition:
1. The study or investigation of ancestry and family history
2. The recording of the ancestry or descendants of a certain individual/object

Serial genealogy tracks the transaction and multilevel composition history of any serial-controlled item from receipt through work in process and inventory to your customer sale. The composition genealogy is captured through material transactions in Oracle Work in Process.

The following transaction types are supported:

  • Purchase order receipt
  • WIP component issue
  • WIP assembly
  • Sales order issue
  • Miscellaneous issue
  • Miscellaneous receipt
  • Account alias issue
  • Account alias receipt
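The genealogy captured by those transactions can also be inspected from SQL. This is a sketch assuming the tables usually behind the Serial Genealogy form, MTL_SERIAL_NUMBERS (with its GEN_OBJECT_ID column) and MTL_OBJECT_GENEALOGY; verify the table and column names in your release before using it.

```sql
-- Sketch: list child serials recorded under a given parent serial.
-- Assumes MTL_SERIAL_NUMBERS.GEN_OBJECT_ID links to
-- MTL_OBJECT_GENEALOGY.PARENT_OBJECT_ID / OBJECT_ID (verify in your release).
select parent.serial_number    parent_serial,
       child.serial_number     child_serial,
       child.inventory_item_id child_item_id
from   apps.mtl_serial_numbers   parent,
       apps.mtl_object_genealogy mog,
       apps.mtl_serial_numbers   child
where  parent.serial_number = :parent_serial
and    mog.parent_object_id = parent.gen_object_id
and    child.gen_object_id  = mog.object_id;
```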

Friday, November 6, 2009

Jumbled Order Handling

How to Handle a Jumbled Order?

Question: What is a jumbled (or mangled) order?

Answer: A jumbled order is an order for which the delivery information and the order information are not in sync. The typical case we are going to discuss is below.

Case: Say an order has 4 lines, each with a different item and a quantity of 10. Line 3 splits into two lines with quantities 7 and 3.

e.g.

Line# ----- Item ----- Quantity
1.1 ------- AAAA ----- 10
2.1 ------- BBBB ----- 10
3.1 ------- CCCC ----- 7
3.2 ------- CCCC ----- 3
4.1 ------- DDDD ----- 10

Line 3.2 will carry a reference to line 3.1 in the SPLIT_FROM_LINE_ID column of the OE_ORDER_LINES_ALL table, i.e. the LINE_ID of 3.1 will be populated in SPLIT_FROM_LINE_ID of line 3.2.

Now shipment happens for lines 1.1, 2.1, 3.1 and 4.1, and later 3.2 gets shipped. For all shipments, the details go to the WSH_DELIVERY_DETAILS table, with the LINE_ID populated in the column SOURCE_LINE_ID. After Interface Trip Stop completes, the entries get posted to Inventory and the order line gets closed. WSH_DELIVERY_DETAILS.OE_INTERFACED_FLAG also gets updated to 'Y'.

But in some cases it does not work as usual, and hence we call it a jumbled order. Most of the time this happens for split lines (e.g. line 3.2 in the case above): the order line gets shipped properly but does not get closed. The reason is that the record(s) for that split order line in WSH_DELIVERY_DETAILS carry the LINE_ID of the parent line (e.g. line 3.1 above) in the column SOURCE_LINE_ID, and the parent's line number (i.e. 3.1) also gets populated in SOURCE_LINE_NUMBER, which correctly should be 3.2.
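The symptom just described can be screened for with a query. The sketch below flags shipped split lines that are still open and whose delivery details point at the parent line's LINE_ID rather than their own; it is an illustration of the mismatch, not a supported diagnostic, and the quantities must still be inspected manually because the parent line's own (legitimate) details will also match.

```sql
-- Sketch: shipped split lines still open whose delivery detail
-- carries the parent line's LINE_ID in SOURCE_LINE_ID.
-- Inspect quantities manually to separate the parent's own details.
select ool.line_id, ool.line_number, ool.shipment_number,
       ool.split_from_line_id,
       wdd.delivery_detail_id,
       wdd.source_line_id     wdd_source_line_id,
       wdd.shipped_quantity
from   apps.oe_order_lines_all   ool,
       apps.wsh_delivery_details wdd
where  ool.header_id          = :header_id
and    ool.split_from_line_id is not null
and    ool.shipped_quantity   is not null
and    ool.flow_status_code  <> 'CLOSED'
and    wdd.source_header_id   = ool.header_id
and    wdd.source_line_id     = ool.split_from_line_id; -- detail points at parent
```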

Now when Interface Trip Stop runs, it completes and closes the trip and posts the entries to inventory, but when Oracle's turn comes to close the order line, it does not find the proper reference in WSH_DELIVERY_DETAILS, and hence it fails to complete the remaining operations on the order line. The result is that the order line remains open even though everything else is complete. The Workflow Background Process also plays an important role in posting entries to the different applications/tables.

To fix this issue we need to feed the correct information to Oracle through a script, and then run the program so that it processes the record(s) again correctly and closes the line(s).

Solution: Correct the data in the two tables WSH_DELIVERY_DETAILS and WSH_TRIP_STOPS, and run Interface Trip Stop (most of the time this program is scheduled).

  1. Fix data in table WSH_DELIVERY_DETAILS.

    a. This requires knowing the LINE_ID and line number of the line on which the issue explained above has occurred. Run the script below to find the LINE_ID.

select to_number(ool.line_number || '.' || ool.shipment_number) line_number,
       ooh.order_number,
       ool.line_id,
       ool.split_from_line_id,
       msi.segment1 item_number,
       ool.ordered_quantity ord_qty,
       ool.cancelled_quantity can_qty,
       ool.shipped_quantity shp_qty,
       ool.flow_status_code,
       ool.org_id
from   apps.MTL_SYSTEM_ITEMS msi,
       apps.OE_ORDER_LINES_ALL ool,
       apps.OE_ORDER_HEADERS_ALL ooh
where  ooh.order_number = &order_num
and    ooh.org_id in (:org_id)
and    ool.header_id = ooh.header_id
and    msi.organization_id in (:master_organization_id)
and    msi.inventory_item_id = ool.inventory_item_id
order by to_number(ool.line_number || '.' || ool.shipment_number);

    b. The DELIVERY_DETAIL_ID is also required; it can be obtained by running the script below. Pick the appropriate DELIVERY_DETAIL_ID based on the item and its quantity. Most probably you will not see the correct line number or LINE_ID here (it will be that of the parent line from which the line was split), so ignore those and match on the item and its quantity. Also note the DELIVERY_ID here, which will help you find the STOP_ID in table WSH_TRIP_STOPS.

select wnd.delivery_id,
       wnd.status_code st,
       wda.delivery_assignment_id wdai,
       wdd.delivery_detail_id,
       wdd.move_order_line_id moli,
       to_number(ool.line_number || '.' || ool.shipment_number) ola_ln,
       wdd.source_line_number wdd_ln,
       wdd.source_line_id linid,
       wdd.released_status rs,
       wdd.inv_interfaced_flag iv,
       wdd.oe_interfaced_flag io,
       wdd.requested_quantity reqqty,
       wdd.cancelled_quantity canqty,
       wdd.shipped_quantity shpqty,
       wdd.cycle_count_quantity boqty,
       wdd.split_from_delivery_detail_id split,
       msi.segment1 item
from   apps.MTL_SYSTEM_ITEMS msi,
       apps.OE_ORDER_LINES_ALL ool,
       apps.OE_ORDER_HEADERS_ALL ooh,
       apps.WSH_DELIVERY_DETAILS wdd,
       apps.WSH_DELIVERY_ASSIGNMENTS wda,
       apps.WSH_NEW_DELIVERIES wnd
where  ooh.order_number = &order_num
and    ooh.org_id in (:org_id)
and    ool.header_id = ooh.header_id
and    msi.organization_id in (:master_organization_id)
and    msi.inventory_item_id = ool.inventory_item_id
and    wdd.source_header_id = ooh.header_id
and    wdd.source_line_id = ool.line_id
and    wda.delivery_detail_id(+) = wdd.delivery_detail_id
and    wnd.delivery_id(+) = wda.delivery_id
order by ola_ln;

    c. Update script for table WSH_DELIVERY_DETAILS:

UPDATE WSH_DELIVERY_DETAILS
SET    SOURCE_LINE_ID     = :LINE_ID,
       SOURCE_LINE_NUMBER = :LINE_NUMBER,
       OE_INTERFACED_FLAG = 'N',
       LAST_UPDATED_BY    = :USER_ID,
       LAST_UPDATE_DATE   = SYSDATE
WHERE  DELIVERY_DETAIL_ID = :DELIVERY_DETAIL_ID;
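Before committing, it is worth re-checking the row you touched. A minimal verification sketch, using the same bind value:

```sql
-- Sketch: confirm the corrected detail now carries the split line's
-- LINE_ID and is flagged for re-interfacing to Order Management.
select delivery_detail_id, source_line_id,
       source_line_number, oe_interfaced_flag
from   apps.wsh_delivery_details
where  delivery_detail_id = :delivery_detail_id;
-- Expect: SOURCE_LINE_ID = the split line's LINE_ID,
--         SOURCE_LINE_NUMBER = e.g. '3.2', OE_INTERFACED_FLAG = 'N'.
```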

  2. Fix data in table WSH_TRIP_STOPS.

    a. Use the DELIVERY_ID from the step above (i.e. 1-b) in the script below to find the STOP_ID. Look for the relevant data and note the STOP_ID.

select wt.trip_id trip,
       wt.status_code trip_sc,
       oola.line_id,
       wtsp.stop_id,
       wtsp.status_code pickup_sc,
       wtsp.pending_interface_flag pickup_pif,
       wtsd.status_code dropoff_sc,
       wtsd.pending_interface_flag dropoff_pif,
       wtsd.lock_stop_id lockid,
       wnd.organization_id org,
       wnd.status_code,
       wnd.planned_flag pf,
       wdd.delivery_detail_id wddi,
       wdd.released_status rs,
       wdd.inv_interfaced_flag inv,
       wdd.oe_interfaced_flag oe,
       ooha.order_number,
       to_number(nvl(oola.line_number,0) || '.' || nvl(oola.shipment_number,0)) ln,
       oola.flow_status_code flow_status,
       oola.org_id ordou,
       wdd.org_id wddou,
       wdd.source_code src,
       wdd.source_line_id line_id,
       wdd.source_header_id hdr_id
from   apps.wsh_new_deliveries wnd,
       apps.wsh_delivery_assignments wda,
       apps.wsh_delivery_details wdd,
       apps.oe_order_headers_all ooha,
       apps.oe_order_lines_all oola,
       apps.wsh_delivery_legs wdl,
       apps.wsh_trip_stops wtsp,
       apps.wsh_trip_stops wtsd,
       apps.wsh_trips wt
where  wnd.delivery_id = :delivery_id
and    wda.delivery_id(+) = wnd.delivery_id
and    wdd.delivery_detail_id(+) = wda.delivery_detail_id
and    ooha.header_id(+) = wdd.source_header_id
and    oola.line_id(+) = wdd.source_line_id
and    wdl.delivery_id(+) = wnd.delivery_id
and    wtsp.stop_id(+) = wdl.pick_up_stop_id
and    wtsd.stop_id(+) = wdl.drop_off_stop_id
and    wt.trip_id(+) = wtsp.trip_id
order by ooha.order_number, oola.line_number;

Alternatively, you can find the STOP_ID from the Shipping Transactions form: query the order number, find the trip number on the Path by Trip tab, and then find the associated STOP_ID.

    b. Update script for table WSH_TRIP_STOPS:

UPDATE WSH_TRIP_STOPS
SET    PENDING_INTERFACE_FLAG = 'Y',
       LAST_UPDATED_BY        = :USER_ID,
       LAST_UPDATE_DATE       = SYSDATE
WHERE  STOP_ID = :STOP_ID;

  3. Run the Interface Trip Stop program.

If the entries are already posted to inventory, run the program with mode 'OM and DSNO' and provide the trip name you noted on the Shipping Transactions screen. (This does not matter much, as the inventory interfaced flag for the record is already set to 'Y', so inventory will not be updated again.)
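After Interface Trip Stop completes, you can confirm that the line finally closed. A quick check, using the LINE_ID found in step 1-a:

```sql
-- Sketch: the line should now be CLOSED and no longer open,
-- and the detail interfaced to Order Management.
select ool.flow_status_code,
       ool.open_flag,
       wdd.oe_interfaced_flag
from   apps.oe_order_lines_all   ool,
       apps.wsh_delivery_details wdd
where  ool.line_id        = :line_id
and    wdd.source_line_id = ool.line_id;
```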


Monday, August 17, 2009

File import in Customer Data Hub (CDH)

You can load customer data using file import or bulk import.

Bulk import is the process of loading customer data in bulk from legacy, third party, or other external sources into the TCA Registry. In Oracle Customers Online, you can load source files of organization, person, or contact data into the TCA import interface tables.

For the transfer of data from the interface tables into the TCA Registry, you have these options.
• If the appropriate import profile options are set, the file load data can automatically transfer from the interface tables into the Registry, as soon as the file load completes successfully.
• Use the Oracle Customer Data Librarian (CDL) batch import feature, if you have CDL.
• Run the Import Batch to TCA Registry program using the Trading Community Manager responsibility.

Bulk import handles information at the party level only. To import both party and account information, you need to use the Customer Interface process.

An import batch contains information about transferring a set of records from the interface tables into the TCA Registry, including settings for the optional data quality pre-import processes that you can perform on the batch:
• Batch de-duplication: Identify and resolve duplicates within the batch that you are importing from the interface tables.
• Address validation: Validate addresses within the batch against authorized sources.
• Registry de-duplication: Identify and resolve duplicates between the interface tables and the Registry.
The import process basically consists of a few main stages, described later in this article.



====================================================

In this article we are going to see file import for “Person” type of parties.

Prepare a CSV file with the details of the persons you want to import. Here I am going to show an import of two records, with their contact and organization details.

Let’s prepare the CSV file:


First Name,Last name,Address 1,Address 2,City,Country,Email,Gender,Marital Status, Organization Name,Birth Date
Christian,Slator,99 A-Towers,,SouthHall,GB,pqr@eds.com,MALE,S,AC Networks,1-Jan-1975
DEVendra,Gulve,Rakshak Nagar,Kharadi,Pune,India,d@gulve.com,MALE,S,TATA Motors,11-Jan-1999


Log on to Oracle and go to either of the below responsibilities:
• Customer Data Librarian
• Customers Online

Provide the details for the import. Also specify the format in which the data exists in the CSV file. If the first row of the CSV file has field names / column headings, then select 'File Header Exist'.
Here we are going to import person information. Select the information type you want to import. For each type, certain mandatory data must be present in the CSV file.
E.g. Address: Address1, Country. Contact: at least one contact point (phone/email).








Here the columns from the CSV file must be mapped to Oracle column names. Usually the import is done in a specific format (depending on organizational needs), so you can save the mapping and reuse it each time to reduce effort.
Oracle also provides one very good function, 'Auto Map'. If Oracle's column name and the column name in the CSV file match, those columns get auto-mapped. This function works only when header information exists in the source file.

Oracle will auto-map only those columns whose names match. For the rest of the columns we need to do the mapping manually.


Search for the appropriate column name and provide the mapping as shown below. You can change the column names in the CSV file to make this process hassle-free.


Once the mapping is done, have a look at the preview and then click Finish.


Oracle fires a concurrent request to import the data into the interface tables. If any error occurs, you can correct it.
Oops! We encountered errors.
Let's go and correct them.


Here it looks like there is some problem in the record with First Name = DEVendra.


Here it looks like there is some problem with the value we provided for the Country column.


The import program needs the country code, but we provided the country name. So let's correct it to the country code (IN) and apply the changes.


One more column I am aware of is Gender: it does not take M / F; we need to give MALE / FEMALE, and in all caps.
Also, for the Marital Status attribute, Married / Single will not work; Oracle needs the values to be M / S.
Now the obvious questions come to mind:
How can we know which value is valid for which column?
Is there any document that tells us the mandatory columns for import?
I have only one answer: this comes only with experience. I did not find any document answering such questions. Alternatively, you can search the different lookups used for the respective attributes.

Now we can finish the error correction process.


If you are going to put data into the interface tables directly using SQL, then it will need an active batch. So before using SQL you will need to "Generate Batch" and then "Activate Batch".
In our case the batch is already created and is in Active status. The errors are also corrected, so the batch is now ready to be imported into the base tables.
Select the batch and define the import for the batch.


Here you can specify whether or not you want to check for duplicates within the batch.


Here you can specify whether or not you want to validate the addresses in the batch.


Here are some more options to make the import more efficient and accurate.

If you select 'Preview Result Before Running Import', Oracle will finish only stage 1; you then need to view the details, and after you click 'Complete Import' the further processing is done.
If you select duplicate checking, you also need to specify the match rule to be used.
As we have not selected duplicate checking, Oracle creates entities even if the same entity already exists.

Oracle fires a concurrent program which moves the records from the interface tables to the base tables.
Keep refreshing the page until it completes.


Now the import status will keep changing based on the stages completed.
On completion of stage 1, the table HZ_IMP_PARTIES_SG gets populated with the records.
On completion of stage 2, the related tables with names like HZ_IMP_%_SG get populated. All tables with the suffix _SG are staging tables which hold values during the import process for the information in the interface tables (e.g. HZ_IMP_PARTIES_INT).
In stage 3 the data in the interface tables is processed and pushed to the base tables (e.g. HZ_PARTIES).

If any error occurs during the interface, the details will be available in the table HZ_IMP_ERROR.
You can search for records having errors with the condition INTERFACE_STATUS = 'E' in any interface table, and then get the associated ERROR_ID. Search for that ERROR_ID in the table HZ_IMP_ERROR; there you come to know which column in which table has the value causing the error.
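The error lookup just described can be written as a pair of queries. The sketch below follows the text above (errored interface rows carry an ERROR_ID that the error table explains); the exact error-table name and columns vary, so verify them in your release.

```sql
-- Sketch: find errored rows in the person interface table,
-- then look up each error (per the description above).
select int.error_id,
       int.party_orig_system_reference
from   apps.hz_imp_parties_int int
where  int.interface_status = 'E'
and    int.batch_id         = :batch_id;

-- Then, for each ERROR_ID returned:
select err.*
from   apps.hz_imp_error err
where  err.error_id = :error_id;
```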



Hurray! Our program completed without errors.
We added only two records in the CSV file, but Oracle shows a total of 10 records imported. How did so many records get imported?
Even though only two records were in the CSV file, Oracle entered 5 records for each person,
i.e. person name, address, contact (email), organization, and the organization-person relationship.

That makes the total imported records = 10.

Let's check whether the details we imported are correct.
Query the person:


As we gave an organization name, Oracle created the organization and built the relationship between them.
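The same check can be done from SQL. A sketch, assuming the standard TCA tables listed at the end of this article (HZ_PARTIES and HZ_RELATIONSHIPS) and using one of the sample names from our CSV file:

```sql
-- Sketch: confirm the imported person was created as a party.
select p.party_id, p.party_name, p.party_type
from   apps.hz_parties p
where  p.party_name like 'DEVendra%';

-- Sketch: confirm a relationship was built from that person
-- (e.g. to the organization given in the CSV file).
select r.relationship_type, r.subject_id, r.object_id
from   apps.hz_relationships r,
       apps.hz_parties       p
where  p.party_name like 'DEVendra%'
and    r.subject_id = p.party_id;
```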











Below are some of the tables used in the import process:


Entity             Interface Table           TCA Table
Addresses          HZ_IMP_ADDRESSES_INT      HZ_LOCATIONS, HZ_PARTY_SITES
Address Uses       HZ_IMP_ADDRESSUSES_INT    HZ_PARTY_SITE_USES
Classifications    HZ_IMP_CLASSIFICS_INT     HZ_CODE_ASSIGNMENTS
Credit Ratings     HZ_IMP_CREDITRTNGS_INT    HZ_CREDIT_RATINGS
Contact Points     HZ_IMP_CONTACTPTS_INT     HZ_CONTACT_POINTS
Financial Numbers  HZ_IMP_FINNUMBERS_INT     HZ_FINANCIAL_NUMBERS
Financial Reports  HZ_IMP_FINREPORTS_INT     HZ_FINANCIAL_REPORTS
Parties            HZ_IMP_PARTIES_INT        HZ_PARTIES, HZ_PERSON_PROFILES, HZ_ORGANIZATION_PROFILES
Relationships      HZ_IMP_RELSHIPS_INT       HZ_RELATIONSHIPS, HZ_ORG_CONTACTS, HZ_PARTIES
Contacts           HZ_IMP_CONTACTS_INT       HZ_RELATIONSHIPS, HZ_ORG_CONTACTS, HZ_PARTIES
Contact Roles      HZ_IMP_CONTACTROLES_INT   HZ_ORG_CONTACT_ROLES


Please add your valuable feedback / comments

Disclaimer:

This blog is purely personal, and the thoughts expressed here represent only me. The purpose of this blog is to share information and knowledge about Oracle's products which I have gained through my exposure to the products, practice, and observation. It has been created with the sole intention of sharing knowledge, for learning purposes, and for storing portions of the vast Oracle knowledge world. Oracle EBS is an Oracle Corp. product, and you should contact Oracle directly for any specific fact or issue.

*NOTE: A few articles on this blog were not prepared entirely by me; their content was edited and compiled after referring to various sites, to make the visitor's job easier.